[Binary archive content: `ustar` tar archive of `var/home/core/zuul-output/` containing the single member `logs/kubelet.log.gz` (gzip-compressed kubelet log). The compressed payload is not recoverable as text; decompress the original `.gz` file to read the log.]
V5pг!rKJ |~m@z 4ϼ;O ܦޭ=-yM|J9f󥷲ޤQ\U=m)ڟUHL~vzx[d&PڳVpГZ5mh+\KvNLC@S_yw{H/E5moExrE;Wf/Ȉm d.d:kv;+[w[_ ,[&vg,3;'Jvڙ.*is"V 67^"""}Rӝz,(GJ&Eu' 3Sq+'` dap K|O=_lv㩓gOtW!gU4sVFfN8 <)RuH"sTfbWCʔS/Tj4vCP q쀜)&8 d%[w6YqqJv5軛%dx]ZsSc z(9BT*Q%e\DŜadVSXG"\JlB fQc(,<RjYDW H)χS8q 9h"8ư6 8-h "b ( ѐlBK+3ڄPgy5g+Vq$\@Ke 0'Ԛ)#EDM7{|#=^>ʅQocVD'kF2jꈑZ5*A '18$ѠgHp3t@::^k?&GkB[)pgPnOT$Scq,.Wx-9ys.bcgmI EJRhY/1@Nu9 GWGuT;xƌ?L#~5٭wC`{;u"}ziKr)]8L/ʞ Dg DzgQq;sWūie}W[oA2d{,$*޶PV[E9YnqZ)jii]Uo_L?f5{ۻ*_w&`\ Sh^/9^rݓ r܃ػNG^ Gx9leƝk05?~ry ~cc28Ny+  )X0APc;bP| ŕT|1 $|{g@9!xZ`T9F$㑅H>HԔ1тFQ4`pH\crr5_r9mgmNΫ2K4ϯL-q1e{FW!ݏꗁn &`o.$|&%YEY([v LRbrY#TcFxI.Rg:EKbLx- AĶku3gKM-~1p5C.D1$Ncm9ewk!ck L G^\0y>%Wہ}Ϲٯws,U.sгWpf%/MMC1pctLvܹ]}=lG;.3gntzWKE~3,Nn:9H >P1@#38(JIaJQ1xΒj}+ydU@;H&eۭ[%Zxo~Y e;\#`vtYw WO,XPt,Na+;SrBPIĒRJP_yLD.އdz\ b+x'rTg.+ǟzKu7] I+y5PnϝoԜ2IG%k2>9~4 2O},\k!,Vh7^/ [bRBrhaQuH2Bd71$R*&VdN<_\fk %Ic?-~ J{I,'ȝ\>wqKpb&hhF&JVNtVIXRYﴠ0)G"ED[B IFe[)s+X'c/HƌHPNM򠀥,H.8!(Q{p劕gC2$N)iZ -6Ny!g@`oD+BPe]j~}^{JAT,gNiO<}.w3x@MG\~}{]fl׵A\ ~ќl54.- 5?Cq@EaRiu<aRF:0NHL(Em18(8Nj"0  ͸r}k ϯ]KPx7;RvzeJzDZA[50[?Ćfav9'7;+4[ܴM2U\goJ` QX ^X"yq;d! xq^_Q:s0qI8)~K'~ Rwv^-77s$*OWws;On?^! #Ƒ@iM5n*'Q߬7Owyݯp~xsۏ;}ÇSy{8 LPgpD%;nooкq\C3SO'|qGnh 7Vlmp%@,~f}N6~;!W~ӍDWV׃v{$nMfU]nn c^H3`4ń+q; _lֲ^$߸ !#!hfցPنXLDF 9}cB@ GVIzhL畕qا."D+H'Nk >9D{͕@R@CʢNSYoc^} Rlf`ͫ5<Хi4^5#O4]QnLdZyдWuKO$ 9_uc#vïqk{T$TL2 .w!o=j-S=Z~6zt$+s7̖S4"tVdm] b!Achu>;?Jl\RhXb R1ĉF'#C\dG4h$GN5 VɹN +k^geoʪ6׮ⱓݽW%" ':L4*ȂrʜY@@QzCT@F7 !(*:mEx%O 5l_J~c?Vn|g(t ܿYpl(K67 Eŗ3! 
)#Tl8}Cdj<\$1%s ?iݩ \X 7@3,=A*Tqj(|tZ=\GnS <ɽuUZUC?!ՂJz m]@'EZ|NwjbrSӽyRhWE/8_4V D.j[: @ɳPHr2RJN8S&:s*%킒J{r#c=d r{`Gzյm]} V()5Ao9XC#w&@()Q /ɍъp7MiKu9 %r?TZ"orAgmBUzQQ,;oȺwˠ!_#g3RqK?[<1٥x3kIK_/yî d Yz!f; tzOɫ3.P踴}8*?-ຠodɴ%XJg+IŸNztn\ǎ~%hefXYDuB`%K4*!CcM8\Xύ$!\M*$ <#:j,s 3%ȹ;)񍏴mG ^(v牠%d毠u'?(~~8VN᩟WWׯׯwv*TOb8ލ`?~ yFq7rc?~Q; F+EE$<pѣ䬲kKWI^e @٠hKu:tw5_[xIO_fs-k +$v/ .߮.>`h n1|"ǐku1dZ*C$"H_2  ̵7KC@qZ%2k铇V~N梨(sQ!ҐL hE#'T٨*2%׉Bܧv+u(E֐ drѸ!0XCrX8mە:"gfѨCΌlhڗ^[CM9LeölUW !aFh%dL#A"iG*sQ$f;< 8kLn$9Q¡)&L'u M0y0kU,I+ƙ4"#GA9&a a:*eP{ϒ@t1*-Cn oYZ#gK9;lj"POJ Cڢ!qJ A a EHLEJMJdE#!u"Ъ|~)?U[| $! xLrhYSҽMxC]Jz.shAY 1OVvⱋxX釗ׇ"qCC,[ ^GuscuFxIY]RN/y + Of ]%TZO6rn7o)%MJJK)9Jy(%u)) !d5*C($7,*q#"  `qIC_mŲv+vK^Z5# j],V"UhƢW܃I ')o"""|,C48>PBI=% F-S.4M'r /. ɊD\3^bu$"eR%#֢Qu -)r>"e2^ \J T 놊~jp:@Iu%&3itre;7wW[P3(ڠ+ϵs$aF'kl@TWMut̩5u>sUbU9 -E~2VDž{/W+;ߓ^oXJ{HDJ˧ cPyB3ոO`CLUADS\*B2PЧ=91Y }vΉMse 󐃔B -)41JFj*JPKlEW%ԹW9WcuO,P{;z+BQ| &Hy* >uGnaJ$&B30ol5/=e/^&ISB4"[f17i 3^,;^ݻLo>7^,%]J|` w{}oWH1jغsi{dxU7ݷr[VزZHknAwww($СV0L|=n]d~2NB97_qd\5^k[KVi0.OߵyLp/kb?_X좷=7\ ͭ|a |a~`JTܛTC=9V٨}eyBͭ ~"]*:T|5sW/ 9>…╘`7^0-d1qj\6 e,Z#5(OJ;VC-d'#W\񯧳hͺuPُ6-Tb峿-&L8 * R&XH.Rg&EKfLz-*AĖ6=i}ճGpZ/l3$'r(.j&A ^z)W3 Ͻ*R)kګ:(d2/\i*I+ ֙L-;{s7W/\I:cD'f!PD;:~׾~ՒZ\W+ORRU'Z0X&%-7?Bv3h<;χ5{> g/_edj]O9RgeNIJ+\ֲ Qs&?`MtV=11jj:G5_{/3~G~VnW jI֎0|6 NVQ˵XpE+T%"37+!WTj>&,t?"}ٗWz'KG%̞RdIڭ+nيS~r*ߢcGfD l~ a v 6 {勖!2vy;rMm}AP u9c&װKbBe* bFf;cԬbHd7u.ld_VT OrAd t4ծ7-vF:!zB`](ئ&W]wc&} ]}<=u%JfpUumOIsqz*eD9>c&ꃝvԽV/K${$=f;%4$Rq[PC)x&6c!0ߚ./:Po<ܵYdKJ`pjL,=k`2,VdB`䍾N' Ln+o!þrUm|,W=Y>rs{t)o\?p A{.yYvrjslk1ܾo) zArPMإۄbÃmB?4w6a \o!ܵvl,k5F2zCCdk0'LǸ&IGA0կw..68f%kv0?N_CK0_\N͛>ffek#f\rcfY3iw6r;?SeF~jz)\s2{r/ѓc3d`bU&r62ZTK4WÃ+uyDd:ß_,lZ 5S^ӟ)kWͼ'[OBFA ` g]i%MLhgAG)UFtGcQ;r ((WOVkgPIQ'=m:Gpq{QFjN-ՊQhF$7tTrO|Z[{Jj){+J2B+0ΌJDB ށ A{N]&IS %g'2Z|2t64@ʎZ( W=q]x4K%%K#}|ſ.HtNNOU`kiC̈`\>Rg-GJq}tHpEME- m"D/|2*tTȬ3nds.vן-u効Gprz[7Fjwhؤѡ yL ~~UMSq{[2Z }[J)͛$ESߥ_W+x3oYh[r/7=@cK#Նpz% Qg+hiG`Ōu" 'N@X , =򕃽!!P~o%Zɝ)(\' @KJT:YEJXr00Υl $ItIPDRBHψsN/C W.F*98(%O#ߝvhy4$W 
(D@_CYi@Z*JAxC> Us{,a@Ƞ|Ŕ`"T<) Zn8Y,#SgW8OEf,Zja2ւ44$T 9FCbV'pM",l(X#iAlBjxB1͢J 4- ֊ezFmaM: e 2C_ҭ{W5uh{ǼbKhwڍ^*wUGR4] dLGЖhfPQU=` R R&zNJhIA^sin!']GƒpʼnDTM3U|42\2%\ʋRΤfE̒df9PHЎ)MX Z#W޳$ &QnֳbtԳZ8'5ӿ|(2&1t 2S *#͗\H "#DHIy P\qH UOE~jTd !W ;0ŕ:H3aN8\8βCHBCbPߎz?V{́+9W`lRgBCDԩ@0Yٴ'Vpfccͺ@GȺX69ô niG +DFrg +cL:$TQ@ VFaRkzm+#}ES=?+&"*(|˓r䁷D[eA+ϕvSNjVhkE1 (')QN=V`sюbrv!_&y^X]x}GؤT&wu{/.Lv nvGlv:.Rz HM(؄ xIݔl2YzBRpIh`Fʒq88C7I)pڐvF!)IYB$;hyd4Tx~Zb4c+Z$U`ۓ|u= 1DRtyɂ6F bƩ~1gGL2/-{NN[RɆdE".HbuIL:p2 IkQ:>PsiԚ)c$ Ym^7Q$h.h@Di9ue5|4Ed8 =:!Ԣt,FKMf&vn"ė$?kmH_ed~?] 8bICQݯz|CDc5ǯچPcGdTmtK*"fJElH%1QMjS'UyJf:!ɘIlAZ :b$Fm z) cpH"yϐ~ip0kx m-Ark- Mky4`\ԱMԒ/BC1,er Z+ <{Ki!oȗ07à ׆k;ِmmS#˶bJ<@?ŭJ/$\$n&{LP#gA(dc h=$ZSX S kQ=}!-lͺM3 j_^0Bif 4fс ,ɹ1! HU {,B2r wN @c Xhm H D`ۨQ1,%! j@C}'ꍜg}rv l~hky}ʕl+u^?{p(Azy1V2gR˵rSDD}9r_6@$.QiYT@.eKY%4RX^4+e/k׻XG~e+Kۑ-JcXm?meww\Aڻjr@! -k.(|8}]`XZʩR ˽<>Ռ5_CG2vrMo&Gk$^H1 N9ť|93YR ]k%kvkZjT7;׹7O~ 7WWWb6x >0ZBI TH'Kґ0хT}5p$,r!c0)NE Il-!\0Gd1}H؃%AdLT=I23uS/IN Y$fXEAI$mvQ,yYp agv)x)YCr;7HD3 6r'Æ> } Į.eInWZN?ʯ&Iϖ2bHt.|6ĔG0XGݗz>W#p 7̚v#o.Ge$ʹ+x78<fI r3A ,Rg2D'( "k/%%D )XZYh6!7. -EJтS/ly]f{p߃dnA0|=\ mWA(^X(eU]Wf']D=ʤ Gv7M>DdmqIFuA[[49_*f׿ufx -S^완|,ںd{~҆fI LRLP"AMxSHLQI6+s1kţy_d}}=}38j;hX@PT18Ox{Z8am+/#o!脕h>ѯ!>xr=| {PCkEkm^&S&%a77]",_z›707?pa*Fpp=?`! ivsJr; #Wߟyj_jhgQ)тJ0W %eQ,)%\YiVșտ2YN:6.Qp! ԄpoMWVVw5i ;{2`JPkse?Uo-nVu~V< мn֤?=w?UԤ'R ڮY*r7fԛ4_=_1ɜy=b'{3Uά 5kHE/cA}! < I"<}yN8|[HOxX|OjdT8 ٍ)H"KomW_Ng7˗r$RTLOO\V0dGcD Ѧ_`'.R:JF}ߧ~}?|D}z_>}x-X @gFF&AG ku>.>eZ[:#7Q1oQR&sRڨN (v Y|kTiDBk_E>bGCW@mMw޺ӿVhM<.<t#ʝ$"@9":`,Sk 0[#C@4>ՉӴHQjp^zyy)>nȩ҃/"B)"qacjRrJ>%G[aSMXǰokcz㉕yi߭uO3l9rkCp j|pK%nMW{C f OAs?$9RȓB2PM2~ݺ"WZH"R"EZiti!81@( .PX0io<^0 ttrXQ/O{ IUy:.PHth՞2#,RFXp4G'j^cTne]^l/ Y Za͠]&Pn@dSxy}޶W_/ǓDۑC{\H/Ss.% m 2JHv cQbePuGT5ORA6g\ES^Sd${|.~@W4c/P-3-/Fc SB QQ+R@k4[d"(1(C0!B*CH zg՘IDk1 Jl5ͭ{#csiO=5a6/O;W4Tθ_,ֲrC;W l;O+}u躶ݻ>h:;ίxI.1 [| 0MZ:ZX[|.wmzhGa祖a:l鯹AxY 'B2H̽Z$)0j44I# ǘˆ䳙Hv{vm5{)Fe^]{DC0 \jo1-'*! 
g19iT) %{K-wF u>Cdi)NpMKfb13T NfL6yA&S9ŁR@3ϲ/Ўqȉ[(@ qdEV6%cdн=7q+hWJ)Qh0$Nh YԂ,03&x.g6hzk~OS3,X.z`\ :(3IO9Ƞx\\- q'TEZB>4a> E stdR9(,;HdC֏تZȽ͉WR y iEDH*,UJ8 l|)rAxØ3@"B*=鶀)PA[3g:Qhڄ\ %0EBHd[ m<gtql:i ׿DMͣW/8%d0$z&ccgfKM%:Pw4^Hzt:vM&oL:vüN/4!QtC.kj/ݙMUy5?[E_;x*}zᗪ ^?j5$i=|u*՜[SѬ"b,(ՙ tC[FcޖYZMUq\ՄB*gR;DY!tOSSZ=~5Isϕz KZk4f,71yu܍=mouIrYo2?K!Zӄ `hx01 ]pȠ~gMR,\ >3N00l%x-&/c@1B*i-A`(Ǟ#Ĺۿ__ucr',po// LGx mV7 1UYrAZ!EL̎5IQs-= |&}Xbt]WdggelMa 0SuA폎N~;Mghέ@gl!)^ڠ!3)34z`aYi1X,&cKMbbEdk tUdH>y2o {Es63@jR pY@rR-$472YDrgE!l5P>0fy@ڀiDڋF@3tu[@os-'قyOқC_*r,dyW3W+] HِBTI'!;ͱpۜ{WKQ}tKQ+ ̾9Xԑ[c5&s^v2Aފ "Ȩ5H )KnCL>LY̅ƒHmd>3[z|šk0t6owsӯ=riI X{~__tm̒.p3f+q d1F0)ȣ`HpEc i#()P \A(E=>pS UwnRuh|>%? @#u\5Т7_aDt:kg|^w.= d|\3kz1&|0w^}bJo^&oyn񿇐=Y>t\tz ~MAm[Y4uץW[lf!6m"vߔZ300S3[7Hԙַ_\j|NFĮ] ԥsx P}Kb& KVʴ&M,_w6UZ2E>zՍ܆ѺW6P^5v6ޞNzXp݅O9*x8 A E*8 $'Q3sͧWmn M[7ޜs)oSTgJ 919'cOTGU0Iu+-)g@Eܛ|MD_&}XQO@<㻠dd,E.eIRrIX ;ѱ/F!.6+fuacrEF^d6z#myqYfG 4MrV1D噊tR Z.ўSh2KN((N%3NrΚBZx*0oBe/i\$"žԙ;ME&BE\VXzїvlۥEj|2~6k&bp 0''Ptmvc4Fb'㣋@O/f7ݜYs&^Knl,9FJNJϟO&X.M:UQW{|W'?梖.iNw{Ѹ=]O?.z{ݏdqD#0#:8{>Nxtѵko޵pK-X`66g8?Xr .滝OpU8Bb̏H9v"c]*0< sو.zـ䄷5N}vlZ/nË7Kǿ )B&Wh&V+v * f|,8#9r[;IO0yu?qa󪲒!H"=LV3`-B!=襔Y()<M^6u*9|*!ͧ fnl=k6iK5~ՐoiLw^|;<`g9,9UA/UY]t0jnEٻ6r$W}ډj㑉|C^sو9 5E )moIɔ, \VT*_8*-W󶑺4˿xΒn y?{T:.A\Ay lMV00ݺnJN|!u=݊>z@*@(!Z;\eLܪv!{UJ%=1[K丂vwqgƮO?>K̇o71jk f1*‚E[m8$+_ wͅlY_tI*BΤq.»t6_ [f_&Aluek_D]Gsҳ77-o_%uʅ":u~nvrrMN_Xۯx~׻kj=[o^TC],m!\閥W™-mmaGw ~J7/%w-E!+9E,U ZD!\vkNFx]Fc:l{q#{[v1KZ1w~4#: V>ۑu/d]tK|beI RlGsC@pBoBp@w2o~8' ZcO~kTR~7?!udU#7+5jTQ]*/ ԕlVxU#ԩFm8BUnTWE]գ|gvu3ܢ=%+xg8s&q_EVٳ?+sz14j_AA3ox8O?6`FUFO$WU>ZH%أRӨWӋkgI+SQWsréFcWWJ ՕA2;?]2纰`uX0r* hjXy.K(~KӅ7 #^ b )1a:P[K9=(̈eX|Yl YKR-Dn]l՜"yNNIEJtc"{NȓQ$̼txBst임PI+{'y^O-mNyXpx. 
Ɓ@CZ+!g FS \6VGsحV!=s;phy'V4}(Zzgf7Z[dy&0kCao=Ub|L$_A˭FTڷɵyBE.WYkK9(!SĽ ё ?)nҋ͑MS ץؗ@lJ=ZoQwG/yn*ה=mFIm,+eq*ycrU8GV%Y) $:%ZgOG9{S`= rV*W ,[S%1PVtjdqAia=GT0BPd*zPR;bc9[ףzz=¨rX:RQL l.gVBCV| YqPG*_ /z/ө$nAU\Y$88u@LY޳2 ,f[f At}1It屶5u䂏5!{p HlTr*{UkvYY 6;+>%5h}Qz1h NT i+jhq:F)Qm!z#/zv,׍Ev>,|72))VZhDtE\uĜ)l|% OccC\}=!Jv7A?͚aM Ե 2̇ !UH:97 ceG*zZelN̦2|z1 -N8 ::GaP,cHJ I1GbƐQHF=SZF5$B0r)D^y2$gS1dR|Tm[YjACx((V5Bd9Y?&{%v#g3NMf<}]QǴj_1j#`e*ΘH)U1F# J ɢ7Hc)-5o;$x` )O׋Ͳr{k!23ʼnZ[cJGHu HeTr@Bs!4`ƆR2?@~TtnG#`zsb}N ttΉ}seI) eMlBզhFP: YTUTc.DCw41ݷl}*t7?@'ѻe5 Tz:#<tQwk}rxi/ 5xVtV"e#ˊVACP56_7(_##t9R̙7Nu /Xju(%3h9EfP*@Uۚuٵ9[/t1K#~oB9skn9aQۛߛ#%y3ۆ.HXT=P,W D0B-g1V:wrLt jzyIďӞOvaۡهV.Oy/ƬD ^vsF*D>bDTUIfu&;9*UO͸ +{Q]Qb05+'kLFT0,W }*n|c>•\WcёkgYPMbS- *٪?Z.DM*WAnm|<՜+\|H cɾVTFb%[3v#g*PJף0gw?< ~xxq1rq>~ZKFδFDXӴuL&d(P JU|?I;-xM8Ut{Bԭ3dDrQəIJʅȹb<[f}!a?l26+[.]{oF*Yaia`pdwA&n30i+%G=_5Iɲ,ʲDY -bݿBb%#1 )DXҠg$X2ְIZ'iRҎZv -+=%#;@ UС~ӂ!wJ8|GwߝdUR7'%xW^j6PgWϙ.? R< n'A"fˌ  c vOBJL6 0<} MF%P ){ge(A4=ӏxt𭼑ړɌtz|EΤ7P1PL}07kpV$n<َ9N|!b`u:$2MvJtJst7;s݃?K+V)霓sSnՔIaҾQL>|J/.%qz?ErΌbbԸ/3\l!PVS}8b$lg\7?jZ3ۆAɴ7Wjʹ/{e6=+~SR! 
̜.䦓,YydcOPݯ.:<,N2Hf'`O0"NB䪷g╅beoX&J^~lXz]ՆI9^dRvyI'%kۆK2]7zܨ3swZx6^bwɻ+jfxS9+&.1 ~&-gN`eDPp$@)"a֪Dq oc¸y5Lx1*DDI}m}37t qnnawI k߀\vboB =F6/cF#v2NEǭv1ƴo5ϞYEJ2xؓѴ6 BN<^wwsphj?$a{g^e2jo'{3Uά 5IE(gc3~@t};m9 *C_>ΤG;ͧCv03/(<0.% e@Y7 4Q/V]p$\o+ Qw9VlS-zuYjYZhnκ\'FR MM=Zq7{IҨS2) umҎ.(i)YJ9#R"%1 r2JaQi]:‹%HD0 AP$@2||ĽTKJC0TUVo~º[A"x)x*h͙zEw 9cq?tiZGs Svp.A3-(EKcX3c:AjW6Y}+hS $1 ,bxtPp-/s_Dٍq?βI[o493H[gufw{׸,YFpyͳuv?C&8DuL6_ "T*>VVÙ| D-iH=S H S%9AdSwj"\i1x}:M+aUiC(XRy.*,'7[0}w^/aci2~u?zW>1)[d鍼UWj7)J"E]|F+rGcT ã`'.SzU%ؑGTΪi,~/o\;fOj:#|qY-s;H GI/]lra!0q$Ɨt4 iFa#HQ`,&+>]<ٿOV=l~8V:}ȦQ5 gic!Le$ ́}65(lzgj?t^ѩGYX\/w߽>?1Qof`:+&0ښ_#mA Vm M14m>u=|qba2 .Oه/o~~Y YeL i+ =+@6uU_dӚd%U[J# QSMmGrmVߴ&}ډ{Ɏ-72m IEp(L 1ĄŴ9̩0JuAk6Lu^^ϟpสȩC/"B)"qacjRrJ1%G[nSMSmOUQ%MvjeC5#/;ʍN%~U_~rVE◖ҫ?H"g[<-` Rbr)hԡK[k'bld1*2[?fW$RC#ځ'*j- iBpcqBQ \X[N q≒)x5yh|X\_t4ND `TDuaphHjQP:dNx[{dwzӔ P '.曼oZ|jgzeX¿=7L1-wL0{rln]$QA6 xn23`8]Q/ D <j ;  C"e>RaJ#68R$#YRz\TkމXe[Z#gD +qLiH{iPk:Gc:s4cy8vў#J;3QaQ][ph:W/ΐy*,:`ucʠ:/i/ יldMil-HPV$ת< B($|:c4j|saJ_{i#3r.{2dO_jyZO?_a6vf~C}eIw7z^}n1>vnr elTz7&J>T+l^Kd0֗iZXk 3o9Ɍ<6a;WoO$?0|&2ߓKpOo %bkeWrCII$5ys\yo>~;9?@>~uV܎vלߟn;j%-Sod;*5yO䢤L ZVYYi3sv_U_k.^\Hn˂I[&u`sʶ&y6-:pC_Z=0|U !ܟ v~\:ʁҪPDi6֕R surP6\xPjʤ휞ݤ'gӓoDNӓ׵x"BR18Rx'ӗj]Vφ,]ʞ֯<ʜ$!aR/ʱ, be}0z.b1h#T16-{_]qEU?%Mahڥ^X<w3:䦫Zt̤Z㒳ۗl2oWM;\"`v?\; mޞWōa(yaj`W,^zImm i_@dKֶ dn #w&;$-ier[FYs ׋Z/^_a4/Y܀^j8a<'CLG kZ-zg՘ ^ˈiDk45[!-WQ/l6m ^`H D tXV[6Z9CjeICѵ6!! ]QQq!fAn$*Ű`K _i"y p2dW8ս?л5>9{8ߗjvWn[ <9)wl峛jeO@1eEާ8DZybH2gb шHbRBc'@މ H i-e B}B))!Yp).}U`̙֌flKi5k'36f:_mq_ ɯf0}WZIݣhd@a g'ZK(IZ yIt$LEt!"'UqQ-6$,r! 
|g!tbaA0 ('…ԋ{G Ҙ6$ɒ_'c}E2FHHhbKFb$S, A aKUڐEN:I$mvx]KlTuKE(j{D*${0e̟%{̫7ۢH D{3Uά 5IEUp،"%ƂXlDN l?]eF0v؄ff%Հh`Gƹdn3EQqAZ( Ҏ+O(ju# (WKM?{WFn]/Crh x'wm] ںؒϒ8ݟ$˶R-YrpF(K/ң :Rc>s,B)aȂϷ(.J"J3?bqm"ƛ@r_eY.H.35StR$h"ȕEBPHp:JA9$4U8:+<2R N%suّTh87$ɑ $-U'ߩ Vk"=Hû֍kIТhuiy$U)AZJ INQr> =?jDB"j^I F(TT±@Lj)%1ɐYV[AM&Lh@p2I+iPϹ)W 6cى O{?͟{T <*=M.G[zR5mLVQ "i8{o+%G)L)kegP;aA(6^ !Y|_#p$C8Ksv aM&L#[͞E͒.0K%PU k r9M0~491i8pQT{v?7Šfջdwa(f6jECjvP v~?#=qXNO/`Ty7;ZH]u?7O^Mg^f׉1FZ }ఙ[i/vI4G?Mʎh1Gik $jHmÈaifYޑ4`ŢǣDO.7i9numk.F42"4iclW|F)2~Cxצy<ÓCb?w߿w/o_}n ߽yE4SڳRi#)9t'~p ?~4|hЂwֳ d\[ƽ&>bژH>PO9pКd *(ޞ}&1?n'ƚ\2i/6p-r3Y#퀡> }-hβBgn$EX(l6y@,'m.T̸X`m丒!FC{{^Uϗm8l&H k@!2BNJ JȦT,rOtrUy:[;nu!G;PY(bsj~<~ D5K_z%L;_Z(+?>~J)+ lApL-Y8+"6N!uЫ qʪx;҄x~BkD1t dEi28)C eB IrnafH{x^nm5wtn9_d.nO{'pL9p!*T$B,j@KM$1i]d L A^x;zdwvgoS;Z./7ylz3xjϬDYsGotܛ}3w`^Rk R܁#S W!8TkP48kƁRI3oT&2E+pᤧV%a$q;'9 >e61x>[dbiQ!V:I8q]A+7.WiS򝗆|u{9^i5u|C Nphv˸U9БͪoRHFASƲc!vYsA‹/˩o7n,T nwߎ?T2pO?|w7j2 wn\)EMf(+ | rVK,pЋ[ M(7TO>g$t2koF5?áe H{lx/AOu?Shl{4\?dj"m΀qp>>;]ZܬF;wXzP.x{|<5Şk4 girvQoSWlcN]Ϳ "4` ?Kof5 [pW賭L2D]!E8pY5]i7r '~ IWkvi S&D|Rw9B?ܼMp&cw/VL$p+p=~qyjVGn-c<{s0:OGi_5-T<@lu?M*'(i|!Wd›.&[d:E+c&Ew*~S()Eb}2:Ru,יIz!&霹,#e\+i-WVkǥYZru]9L gG> ʽ>`J튊DP3:og0rC}ݫrmK߬ͯQKP`ζuQ szMw)jIs?ηbh; YWw=nK! ͦmIe-`YdZmR9Ba. X] DYc΋e<C C EH"\1K)]Qz':-:ɮ2qFHJ@H\֥ȲR\ \g@Y!  
Yxuڦ[&ai}_gLнDrB}2I/?ҙj%Cǁ>ikٽA8t#a,"ʹi9ZE[RURJOu1W%VՑG@ߔg2#-[Sq[$W*W }nStNKRdue#GBZΑVX|LquQXez4֨xJXNx'gc,R*!m-/yI8i+imv*HAX=r$n#ˤ0zP:##9LcuS=TƠTב+ 7t dȸӋl mh̆Z+B9emҀT$£0f 4ٹlE.HꠊCRykjjwOGm1s0Fr[3;)Gڅ5s<{bUeldd,{vO ԻH *g9uQ3FCUIT% FAEvp*YN /Q<I<HҿJ>_+(?=;-TIqeMI\nÖjw;}[.bҨQEqXwi`rFbObO2V!UȵT*cW*TJWzA\f3;HhPX.ǩɅ)QEHXAcNB.3S'vG}ٺOܢO(n!(-WYJ'ZgzZEmKqʂNFV$rc}2N]]ڡ){n= yKu>ku;alͪc$;6~Fw_6TSE5&>᭧B~KBRzm(jTb2 Fo{LFrɝd< c0޽+хCԺ &;p5(F {QdBdT`ӱw5ޓr5ڭp5JɡY>pEekmH /H/s pX;pOgTDʯW=|K"%;g8]S_UWWqa-Qe pEj#"*eƔ3>&+̸U!'Eb& ``[5FΚKBEn$JeӔӮ'[m9߷j!]?LY ;<#cnfE:gA,)s)+$"QVcdjk ,cZ\DZ2Og19A GB]5r[q3mM%5מ|u]W_3[,̨ie:> R®t=m\c*;ܱ#'%6;n#3mY"'-[TP)Y-nZ cOH]y2ȵ*"*T.+gGBԯ7r_90 BM fMG{A,qtK,;4z$-XqzD- 壣9b_+OӸxJJQYOKO [zKS 4Ȣ1 ydQ9g1)rytRd! b27u=Iu! Nv$C)SӲN< uŠuግR۰,GTz~TGO ޟ^-I4]rw&y" _{Hj4}Gi V/?|_6a`i2ĻYM^j+#="qQ`boNL.\D贌95t+N47)L7ؿߦ%30֧ڸ-| P:*b8Xr QoDzF]*oñb vw>@̹WF:V%*s}@u2[e\R[މ^jlG1{tBEYV=Baȕ{nF9 +B6P.BWJK[l{2ꪐTUVCWWJ#:u+Pܬ^,@Yο__)s>0k :Y+^ ;-)+?+Z=R ;_ڃE|0_) X@1,SŘU)V顲&(Q҉/"?eZ@:Adfq[eYP^khπZrVO/a%0=Z D@DF;72/m[7Ok5_[?Q*r%']u㓯T ~qY̘B!jeRAN,C6d4IEHz`Jn<Y[ _96ܹȣH M^;d4W Yܜdi}A4Wza)׹1(ɮG &p`AG9##$])9č@I:@%ean"T僐KKq - o,A8gB)K -ђi ekyzԱq 7'+2KM?l8P%rV0EBcLZ#!\2E0Ac=ZbA|*]ȭfg:k.^R@m@>a02XBZLD$GE4qcPŭOMwº$kWH"L`4f22oL\B0-}ٳ̓8t:lbKmLi5X 4a.yvFY<̰ms-;ʍeM Wu6AJ';^~9mS//T rǿq /{ 9?Dr}U[*$_J1~JVv ]nK0]Rۃ2E2VNY]KRX3ϨrQl &[=A&˾U"nm4`B:erbH[fւ+ ]ϴgS g޲W9{u/ SCST횭f' Z7n>j#|%:H81.#] & ƽS/ @w"I "$X 0Fd'|Ko::^~|;/rO8S-n ܊b'śmO8F0iawͼW*i|bC%qnɄ+ 98ɸ+h$Y;spࠈڡB`F$^uI?KpG"}@6uK;'a4 h,̱,|R`C6*ZI$&-cY5L%4Ĩz,6YNlhƗ0md2~>ϏVk;t'tĭzg!UF%mخ:mޛr=Ŷi7攞B2jiM}):2b9:,Kۈ1L$f Z`J k9n܀)f%% }[R̼Bv9Xkyz ! 
f~&S/s~){=/OACNH&m+[PP&|g7ȱ eQ 5YW+[鹶HpR:ӄUxe[FxkX Ju{Qh]eg\t1B 2Uu:|>v o{@c$.䨈q4':%б(%*ήDv .R,/^TƬeg {/JmFѯZ萢_,x8;'^3/4mh?$"0'4FNb'A\&!8Li%C*ު9Nh.3=WxFnIZ[2!, &%mdН-Yk)gWJoW&FˇR r821``̪t#VXdXYmddqNZOυz =oש !6:L+ RHGOf@[%Kj jY'u~ҵqZ)sςDU@ .r ̌yAH%Iv|6QU)5x&"!2^+"!&ap*:RV҅/J.'*hU8 s08I[{.-EEZc ,|/9eײS\Z4_BRD9'tⱍxX_E:Z㐇JlY:]NGb+[jzs\~2KHMlbɚt_to~/8<>ږLT*| k j2 /^Fc4$,^L 5DEH HE$HE*"^Ib 6K AHRVaB*JwZ,`ZFL&Z܊gx9;4z"ͦ&dhµL3^9j:߻Cd,\ح2u jvK׭޽dV7pj>ڏ6wz3 w1ajкezs0jK6slͺChYB˲]wiz|x{筋=/ܼw7=.k~oC2ї-l6Row ]wWwVyGkOV.أ% ҕzm=:/0cQpFQB{[fKj`+N'9=`-A+ޫN rH# be}0є1тFhH8HlFΎWN/_4d:ixÖl74nxKs=l,D{+=L3d0QXP'[g/S @qd 4*"h@KFR*:w"kqW|]Q`{'_rd9)SۮȖ,K߲E1Oj6m| )w{U@*u)%SJ1\rici6!cBrHx rǗmxg{2X!c"Sz̝v&P=5% ^谕uc)~\:d e%FO'l'_}9!{HʝNzy"!JFB8oFKg\+k)'@p)sI`K)7jʤ͐jcH|40Kt H7,O@\u#H#mQs$o@{;"zaMA-SA| -mP`2ی'Phk~|R{2̒,|(Kp3"RM22npb΂wDðq-AVsњM8]=[[Zb[y7Ek`Cn:gtSӁy̗ZΜ33cf<cX֙C;⅙bcWZ:ګT6RyDmqF+17Y[ϒi-Dz_^ÌV{i 11=Ъ#-A['/A"IjS,!A=&Y&mv|.vJK::諝O.UyPTOǼ=m7 r^`UM#!Ȇe hR H9."F1\gmM\k>̮' {KڸW`դ.SQUa˪Æ^+Bz]j/Ȟ|amC6E̪r:ՃϽ%JL TZ]%"Ur&׃kԝ[ZDAt#cnz5IYEWUP[X 74]OyN牿DOKy1'?lu(m' . a,[mHjy "Ş";VX,)vGE3urM΢S*\*wɩ۽'\/STDRy{8$с~v.h@ :$H*%4v$R(GkbiH5؁&x#32x5sf"%colQaZO9fz:½{,|f)]~mssmڬ'iPpp \ {hd@a2͌'ZK(I yI,1t$LEt!"'U6fI`Y|M\H;!hL> 'Hv%%c}4J5VM`2X汯Ǵ Z(QHLd$&F:łK UfH*5!1s:udN˜9mvI'X)M^00!g&cs|b$ao~ n< Dl{\7Ϸo`7Mo~+M:T ҏj<}8=rBovxs8Lu' #\HZ}qhE"[ǡmm LbZ;Vs {RJ//;?ׂv>EsG0F^Ż/ȕIOTD{]T:X܎ R' iBxO~ '$`AɈD."JTJ+W'$`ɈD.V"TJT^RSNH\W\~2*Q+DU+Ct^=p;r?Ǔ&)ozf޴ ͥa]Mܴ,yf,-xZ[Agͬ{cMQ]^.} 7L ^ا=Yrq]fRXϠ3Li!t9.BCr]!t9.eBCr]˲0B@:!t2B9-sZ!t9n!t?׎wli@;O+B0FӍ|@N! 
^I1s g\"/„~b"㢸B1^Bw^,RьjݻJ{pNQ).9d>)l)<QM9:<:yg84/Z%ZtfӗjSbXHzOk$OQF,B0K)Vʂl&"0+Cʘton(++Z-2<8 4L7{mSR12̒,|&W8Qp/OG\yߙ3pff̘c:sh_2S0Sm exէ ήU6R@hDmqFe5v,Kcr+[Yҭ6KNBb?II$5 0 o VIhZu\zklTm~=|L̶Ij=5Զھ,:6~7m{L>ULT㯘nh.vJK::諝ȞW|?i㤇Wǝ›=GBj 3nrq]Db 1-{ޛ9099&|]O&*q*MG/ 6¯X %u."s"2Ixۓo>O{.ڮ@O<{p+OkK։+AW #Hz+LHx`dXw{=kP:.wIT8Z``4\+q7)1SbJ[&`l9P0Z ^W)ߊچFS3O1`kyGDJNcRd 3âҌ;$8"` (O nqzAFI`2N AK֭" ^6>`hМ/gQQPT{:: ?nG.RM0sxArJpN1L J)|vRX1AظH2Soe0` M0̍QH7f;ţ3@-Z õXG8}uDz\(fůͿŶp0mo`sPg2x}0`y ֦&h\,e#(!f&gRqXEz0E,( #_gĩҞOzdgX!6dKr6Ag$TU۳j, uޛV`UE-6`MH閹tHܴ_ Dz/rcZAa#a Z&s-&"u"j)tqq>/_*HQp3Z<CgJmeʏ6xB1ʽgj[M[T4w7]-HI5]se Fq8t-ws9Rƣ}zwiF[PHΑ_?U0{!;Y^,}k3f1ihx;n`dyCviXJa#i`K]L:g7;SNj:ۅL.a9?|w߽Kۏ?~1Q经 O:Ag]$`l~y<o~|ЪaT&ϧn'|qY1Q ݍ[5Q[oGA>H-wZ,A\I\<+@6US"kP<5U]J?{׶FkTc?8ql8uزdJveŎ(6HpChd7T  |v+Uw^Sj|5>z&^#ɣH ?xDP "T"5^2",T&+UwxO8Nm#bu35cqġ"?]JdDLP@鐈&pHi04TQ9q`mӈTUНEb|Ɏ>mgܛ6蚶Vi;IۑV9r,Vl}Wq5K*QC/]z,}{gn!6=L~o.8]nc.j:+r*{C#ʽRWJ<^sƉuZީ 8(>q-Uv@m[Ex{xLGzt.ost6ϧgpȫB:EtZرT %Fxx u@hkq#{v]1K u8gkk;ii9a:}a.\}ɓo X)DK.:QwID#z ĂggKDA$C$) Fǐ Y `5IH m)5ڄ}$1o($%ҳX(gХu.V3qv3[!3,;_|1#l-_%hkqωzC|R#~jUM:@]%'>ΌC^=;5?A|~D@6ҁ̕%7k!](U!%X&ЩmiKPl0lnj r*EuZКˠ8ZCbMhmؿ\ئbO4`98s=' S|6Q wA/j%#(#2P7wzf-ߟz) M_kX-0AEl"+S&uIsP27GֳUQvJ+䉜%: (0;QAX0gFvzL=lxUu49[&j{>7QHY bZJoИZUբ(*B؄œdc*T_=??N+R-)( HAYmcb`L"~BE4?syZ͠V1NZ*SRQP6HV (RFKBg~c61bl6' Sd?XE{(YB09絭E $OV C݊V9+?P9T=R&VX3R(R$/\C g.$Uh/#Zp!@N*yQ=^`Kw?fȡakߧyOqسI7k-'˄1٤e3D!0R@SAn᧞K0c$ ߉KHcJJ㦤1%"E, =!nQB{P9CM IˈH8j 9bp>2L$E%Qp1[a+c{$"ШcұI t^fmzy1h\֊O/QlF#N6?6}Tя3/q#!:O^e[qZ/ןˤEyZ/aղU/eZ?z%WUH7UQfkyupwRiKRAjU b JHva\Z (.ULST',1`<4Ia&/, F&tlgzH$t侥մ./zHm85zbfw1F'ؾ·|$(1WP*~%ynk7^rY !#V:(,UVK0Y1nrk .E#*ʩdh /]!gkBϹoՑR!-d H]Hh+SIcxL=SWyH7=z\+Rr~3Av}Wz*~N 3Sfg 7޿ۻYܾBnmzYOw*u+N`1'9uǭg㯭ynbws!ewSݝ/y013oo~gOBv{ߠʨd=NMͺSϿok$c|~|ajbsVb<ό -XNHv(KT].uA1v?R رbw?]8( h= :&!2^;тh4ʊ$|=5Y (@.Zb K5([VP,I$0"|ؽ8{Rw7{t7zXg~hvzuyP2ػy<% !"&GDI "rF! 
RV`2)^̓^J}!k-{%f &%R&YXJ[7w5|/}~qsL1%1h5D9Y'd.eUȐK()F2p#ORDiIA8QZVS656T)CdqjD6rx8{T[dNӧPs<9ꉊ5W.yG̢U޽}O+sJ%;0AxtR8De*e/ ϗ+M-EWJM`'T䯄9 .V) Fel&[QIlaq-\]nb h j J^ z/ybs)ZoAo\_,v Xr:FUhںd!Z <‘@_ ޖH[dk-gv=aVu!E8!֗O:;V&"YU[F FT JxZho˗:tc-뀡u6/IMT)&lNgd As?NBR1A 5mԴQӎigaGPlgpf皷eE0IJ3"fgC]U\ȹN]\!<ɹ?97gۗg}YZyyCMxk%A+|gT$~1Β6Ah1i&!(8ѫSYߐ/Ί7X&WgbZ-K3nɃH;ur\_Ş%Ei\/m?9sjF[3{csY j[ ¬زQoFHED.Aʰ*ieMP%QxU CRMdod]A3y8S~2KqmmZb7nDkt[u3n!>r8z9ߣfnQ1Gʛ֕ڒպHr*Rh'V={~y:䫻XS\.rEwI΢mRbtFb/St&E1F,*rLE(Qʌh=8Q :G[1u4* /'%7m{^z10F ш`A2ZPܐF$GTE >HT?XvuȴK4= c]L\' -g Egm4'IjX[6Bړ5+)x*֡p,Ф`~K1JlBH*.=M@k+99d3 ޘWPzέNYڼ'ۅ o/ߧkaV7,c?qГWVd`A$N:|, F)L?㔉ꔑulȠ"5h_N'bPSF6)6Ii8(&xiR]tp&tWG)-gGihٵYr@* %RICL`jqv!n< fC?F">etNf(Y[?Ay#WB/_^G}Tq4 %,AgGayw jaK'NJkUMy?LŴF&6W~/>x cr!̙|:ӏ'J{=KƑ|崮Bč;L@BkOVt֍hFn';FD#XF|'}N=i8:ȶ^۞tkNA=uth|JӴ0 ,Wi_Ө9񯦱QsyN/Nh:嗟}xo˿^}ww?/4SZR?,Yss|ar}c@Oħ/{,s"$"*TM`".Ӂ.u@6du%$ Wv354Ϟ6@,xQUbϬD5Yo{~4F`Ϩ` ̅x6K?%EZC/XRT/X ,*fRc*>NsLS W!VLU> scz{c@F@z)S@nʤ7* ᤧ"}HIƳ%2w8)ޝq'xtakָڡX Q_ѤErqD)L o6lozOPp{m ԑtXgo>z^.-G%y?MVM+/b,@z1X|{ڸ܆AQ.I Ȑ3L:2ȹ~ gw!C ԝMĚ*vwWbCu=Ң{W*76ӟxS2g߼x/y9oUQ&أw C_7\/~3:!0϶2ɨ uR务{lKףFDe6dE)?# ep2|ݣIYpsfBzaxI.jhϗx>h; Ul1hEY'f3"+b1hݜHzlzkoVr݋rLwuk?K +ͶZtU dD>I7= SILB>#F4φq{.LHЙIi7 霩

    H!&h P  )h%sFrٵC7 4b{쳓,֬xįg&ĥN`(`JVFg\yBa39:%&%VY CܦTxyo9@"ö&shgtVym2a6Wݵh:l qRpq6'֒-Pn)!_;\mTZYll޾iݦ?\+<|Juf޶R'rRx^]5wos4׎f"ln͛-Z^s|io~u3o3 moq7vn=ٟSFǟߺmb7yL's}8gռ8[Qq'2%*u]8!|>[So}>[~I. BسK&TNFF:ݧw=wwݥSHCm#(}(q.*+HqT8,e&ﬡMp(@x!e2Ic%Z8$L#Ӂsَ{glޅb|k@i>1c9ͮ0>]NO{pc"ۓ ECnNQjI29plLҔTʹ5@yE-i3qaP!t&XGZt%:$ђU}>c*:٬+qWDž*Bn1SPHC*pFw'zt6;a#G$FLؑ\8M^HHj*R)~U1/8Zp tHKr1&Ns1!g}J:[wZAMxD.C> U\v:ݎU1`C%4 8*@hQsLa>8>; >Ⱦ;ͩb[|[4 ҂c T,JƍKȃ`.3h 2Ȑk !ʙBNikF&z˘ˆsxBnFrе3q6ŲRcI q_N3Ib\*T|W_nlr. U)#,dcF 39@yR3;JUn53ͭfڿ> 81A"gZ2LtR͔&'hTVqo^Ṉ%4G(醉 LXڨCY!0̘A=mʎ3qԳZ$_3Qc6*rm1zɹ1R$ZpR,LI(f98r_-[O=uj)L+2;-;("g&XǶP_9 SIR %y тqRH"<];W-A;:!jX;Էh5a{8LJT#0b<}=lCnٷ/-*[;;^X!OXqMUet\aQFC.$QN%+V5JR4קJI 9j%{$"灁hMx>SF0 d*MLy/VxX}49]=,Fk29rYuiZ9&΁ԕoSbG&~d Ed:)m\Oɹ;,9Yy֧0êі~?kW˔,IfL=DçsnmDoz*NpzOwJ"{2.~#1O% *:n9zt;=>߻t'm=|]|X?`浑a>on5i>y螉K_gn`\<z;pCYw11oϯ׵f ޡ@uְ`sB`s>S~> g ,"V*ʀ.dA={ѹdBdT`ӱaaw۰ܱSaw; . JjE]TV Q$a|YˏJ4 -PE:At!v6Y ,4kZmXmXVysF"&n'e1IƍKȃ`.3ȅР 2eg7A1JSڕ"kF&z˘ˆsxJ@Bj(SM>]&ŗIN[ح{E)UK`JP|YhtB%0a1$k3ɞT);γ2$wi0?vK_΂}JLFH8zǙ b13'C3B%!arBR(Swűdiʢ-ȉ#6 LX !fLɠ֞U@;1ّG{E]RTz{,ՠYBìd+2 VYdagLBm:`Sѐ T5_EG}~oǩY XJ)JHϙ €I72(ZU- '5'GyA[t9(Cf Ed9:2,x4,a6Xhu"ݼml-^`ēlSM]6 zR'jx|Њ$*Tp*lA't\blUys9aocfm/HSIy˙Xk|P`0CzZqe΂eDAj}"y$]/e`R-'96gC1XG5 9:TePqnQkp{АPo&'q"$4;MjsN5I`gs \riY"%vJsT'.8R9(䖪WwMVMF΋:AF)PgژZ&{[MaH)SI|RIWEUuc~X_gd0W4i[+iO۷TTs'd]+%ۗo97*(QkRHoMHzFN{z*+N)IIM`r%X\Qd(gO19xHnJ"^Eo Pyq8,^t!wFɑKek,s//}R#M"3`rBNĔc/3-] =! 1KN_ )s&fKϥ93)j\j{jw4H_~d/>{WVoX,|c:or@箯g߮'o=v \q H~ LT Y1#UI'BK˺(3m(P|B66Ad{ m`XC,28=M׮zm^k[y_)᪲*vPWJx! XТtCҔӑYEwt5֨~Zl>%fcƞU)*4@@()Ő|F&$\NVC>@ ۲},YZSZY[ b9VK8+[NJk,dt'7e/*V镬4Nl;;rLZ 7(1J~1ϥS4]_ (p]./C|7^,ʐ~SJZ.寕>~yM;dͫo#?NWs=[00`擛ոgn&}kAiB[g_C\\ f͗5iv3J!F0)ȣ` m4K%E^OV\AA쀻 jM @}^)/( 2,m7-vJ? 
v\~LӢxϿenf00!f[;V376Džf?E^5N zQH6O~i`/RM6W{[Rˍ=[onUM<랏8BRqIKDEzqcO,>ַwH4gaŞ99ݕf>LlsXzlAè꽇400ca,jKbd ǎ/Z )^;uS`]ks'|=P~,jvNgN*~wRic#g>hr22/A !3DF~ϓ9w0bBȄTvn{ᒺi}rq7R(׍j_j/Cթʧv>e2q!ArRNs& zz0yQcU]_}eDZwZ5" &',ZVgO>x&8K(k#YH!IZV%}#8/heD}*0?8ǥxq׷ΥovEwLP{TH\x.}zc2Һ&so_v WS !o<_[,5;ۆovy:|Z#YzHxVv+?֮eg{y=$`w&/^gˡTpz ︸w~7tO_JwoO"ܞe;y1>i^&0jO#_ОF4G?z<*ˬ 7~!_TnUv1Rq.'a$>pG+ ?] YI;]}1 h?/WmX [!Z+UͬR0r 5jJNԔbvV=-/ Uj:T PWU*k4G<U9A2%yY>+o,mK ^'p8A}>%!ál+[=3cH=-fwuw]]u+z*"eMLj9+j$ѷPBoE/c*_^򼻯]W\ކlU3=eΫ[TF1ktNq #ס`iW3N;ydaKfIF8CY8EC[iǕ'uC‚j0!S&aJ;f2Rʃp-1Kȳ}xe%7="^W&HF.+_7Wi"J!6t]m}S βmĕ7TcIo" ^=#ACJ G4*ŔL TD1 AW%wOH6'OafXTq(" (O 3~=+=EOY~u+D))p*`{@xj;IأA?EF\iBr/h[N Iw : 9@ ̴k- lRKFڷ2FP0` MR(an*G(옅-vGg@k- s2vԸI.̳?럥[%: Dj?aB2}z N26i6w'' */_PYɃV:AHA٫7w9;ꈥPZj|>e C 9d [ hp.0uSҎ u<[UP& -,I7Xa!G9g>Kyr3>L@j&خ)\gSS1Evp~9uc% T'djC.75Jn6 ]Hgg7jTa45uV7O5x[~_4]U#%t ̥:*xt~Q-7kɊIռrP0rqǡxjQ:t6 iFaV#PŤOiB7c.Gu#hI64^Z&?xXHsY@ 47q+cv2" JO_)ҥ׋LՉG:h`•uAn7_/ ]Ew҈.}B9M咏+ K^vZw?&Z6( l#yAC+ 2%ƠekdhscZ{z\:E3J6s;,/ct8yȩ҃kdȰUʱ5{C)UH9E%#-CəQe~ 'vHhv d=hNA3-6ʭw `s~0ѨXa} -e65^gw><iHN*.{#/zMw(C0!B*CHA9cƌf{-#'hj4ěxc[ ߒ?w{v(e)VK@{[魶T[m$ZRj0x}d)<93Tל3p6sZAxag#/c^{^W^(m~Α[2/ĖץkE 0aaFddۄ=9a|}w?f𣣒9[xr~bE[3k\>M^Wdvt3pg(-Y~-f/QĒF,B0# be}0tfDDLt &(H8HwKvg^¼@ȶ|Mv4O~(^k4v%lR'>-ϸ ;@k :9,"3N\I$XFeqP, g(@Ct0Ǖ';)tRK{]][Z'HF.+_Nf.x7_l"$Pa6V%7g!QH҅]` 4\+q)1SbJ[&`l90 _)(ܳ| wD@R٘T`F8̰4N+!H&6GcB(@ܪEU(cK[$LMVۣX4gKcTc!oNuzl.Ѵ !94j-G$;LriA).Z $XR{O L)R)Jd($TDŽYbxtPp-vηwS'ٻ0nTY{\'V{B[sdP>>X pyIf>9UU"EV5 1N_+V cehkE4K3Ww)Sl!N] %n`2\ΛgJ``Utb ք3K Kp5,ȝU,aP(Ef|ɊN ?lj*&RҎ^kp~9~c% tljC.75Jn6 ]H)fg7jT]IfԴɵo+կՃ7W惷}`.4_GE/굥fmW#Y1)~W^bPf(kGl#d0؇bR'XIOb˛1G~$FmkV/G-C=,h$ }~\C6YbWC<٤YqSNug~?{Ǒ.m׃b9' . 
u)0i+kg䇼X=d4J HQY?Xd;~/wo?o7՛?~/<',rz)LF܁>W5VegweW|+!gߝV\cˇ&wr5"ޫz{l~0-)7fTLŃԋR B Ka>v+;P7٩Tl_\iG7ۖOFJPɉHb@dYZ0+a}eI 6C;"6:<&6y" Y(*(B}LkIP$mMJIfaOɞV5[ #Rm}*w۝Y`λY|؎RW\t ۹V a; 1̔pZ:o[U=ǣq30X=^x4*IW򣟼Px(/h|z7I;)Z Qɥ~NZ!yʿ;RB EOz(R^ #g8g/1@Ͼ> _*R0dKBwΚYvNy1,|o[G{0vgz9uz^u/9[xDT^hztyTX챙=hF-Nx>8:eklA^_Ri%WTjq_-`vKltA+)忲4$$rtKn ͇sx.'ꕻ 9*`lȆp8SC&Uf\"Q9f- pH!ѻṯyB8]}Cؘ_ ٢KfB +^Yj7.ܛ')āy0| ˰o֛_\r;+/ s1q&;﷼xC 6ۓҢJ<w!kg0p6kNփZH*و`1+ *oEBh!m k"ɟ -(+6SX$NRV>PhF: 鑀VEkN]y9u Ay8pXv˸hrцБAj6*vfk̮lVUj*-m UOg ptvr\ʧfG;Wt޼m`alyeɖ]_2`p]a"`3 F]I#((ݸdʅu~lCy*M }6`Q2yƐ0tNf\KE'VWsduI7N*L\E><:S/lF^<'COUz}x^}0US??"* 7 fMtN*[KR9.qL(Ju2"Vz&yl1X $EI9 3Q%Bqj3rǩ2G{tInU?9y]PDHb%1VwIVR)1GC6V@SS EV%$AXo HHς3, C,2e#xusJ ԋ%W Yx6kXkHZ _!'Al8/azI_0K֣Nu-GIHC&)3YBw$6̴A.+NxK'X+@(/)| sCp틾|/MơEeLe .t@JtH_Y&YmM#x9^n!,(Ka4de#1+MV:))*o /m>gE@HR!0,U)9[.Ia05SIcxٌ5CWy%XO{'ýz\!9ǧק_d>k5^؈[trJ쵑2;+k^=sp|s~n]"$mr>;]4ydO,Nϭ;n~=䚗.\_-;nVjxzY_> 1XK-^㣣+7{uYYGeȳ}׼8|ptMR[,yM׍ܦ;!B>|&l lsK dIVrʰI X` ϼ)oJ j7p}IFL:R EJP6v^'I\np5>ݗmQ{eO%H()aP,ʈd%e󄰩Q :"( X -) S4(SrFyPLΒ0QEƮfj;V5O+/{gD|7[l{bw6Kn4C* O:`0i0<9 nKL0ܙJ.LJS۾qKՃܸuFJS;#\+쮈J.*zWG\S϶(gS ]l)wQs䴇 lz_oF㣓iģ9=#; #sϝAZj{M8q1}#޴7+oF}cUbfT.DŽmim*vG\UqՓK"*(]\U* RNpW`;#\-守Jr.*0(t~uNs0|tt<~or8KtUt`&x|?c{BJ`$ہ5 *QI+2E Oŵyޟx# Ad  X4Fĝ-RG̱h BXj+HOMW rmTe& & /%MuJl6~#pEȡAAO"YTF_!BI2+Ҁ{{%Z6NJn!W\Hm*Ovz3r+G&o%Zٻ6rdW8mR؇9ٝbEk"KIN翟br춭x@,v"A&ˊHHɠ0'BKZh%}̶.M±P9K畃`Re![m8kpߪ+yy ŕ^?ZUاK<֣W_O@f$ t"{( Jr9e/2g<)%Uh%$7]( js*QɎc& ԯ"uQ-oBBR HE$o,+NޔWU,pY&o2foe S7?mls7 1?OEs`c`qL:Zѿ߼VӜujKe_^Q yQ<OhI!KM];5jPRuHtE>Œ Qf}n"#*)DI1`qI8wS[q8_v#(O0)&p4̑͟bti7_V͛Fz' 7tou đ*es'>%Zwk}Z_=Z]^3v3?'~8_ n ^ՠj9/i?h:|lxޞtp'yg%X-v5bYk ֧Xl:{~irGǃl fi[/?L/Ŋ2T꾘KڪWXlVmaZߢ7 BKN5!^  Ya],q?R_뎑V0D^2F(;Y:LV!U2 ~{acV{3lm:G*(|k#RJc"B>y FÚN/k:8UT`#Y|%X3;C (VI>8hӌWhU+`Q( dJZor&g Шyqd}g٧i뛷7u[>w<_5UWtɬ*CT\-ڤhkTnmft@sfP^w^.7a=[1fVg<ŮFu!yIa7["c;oRċe郹ס 9a=6тoH58Chr:jNydHlXWUn'!ߊYFDf "d!QşR O-脲c7qQAz%`ibK޸Z|bJɛjRE^iӢp4ϧѵj{『^y~2tnb1 8jucuh&2I`B>}e%C%H"4:1.%=%uoklw?BLŒxy2:k=ތ9u/mn`Kv6?ozh-8|98kT`Y@@uihF. 
Ѿ5an*vI]W`<<9I|]'PnF߮wo.vջtyڱ]enx{퀌0g/Ʈn+ ź}ҵ<}T':Ĭk{)1WVCe[?~z/afݔfbqKi*=JrUJLL9tyI4SNݯYy|-#PD #ZQ,W^Ho}!i'[:oK;+NOO& 8kKL`K%>Cb|#lK70 9X)5C,Q5hϕϝϗH  YS!7?VJ2F%pkꄳ܀`3bL˱0I$ ,V**FNwhoOi)-⁕:tgǛ̳=٤ӎ<?gN!"}-2!v;Q\? b:a/ND@7ٴ { ݈/iK%5d0$PvfuU5M&yެoxqv BL$Us^iHI/iSR(Ԑ"E_6zu.&{ ɦ+s^xTNV/)d{^N)_FDiW dx `") 02㌂y̤d߻g?NME`q6{H{My` h~١ $''r!Da')CHBv 6^$,[C9&[00 A%=߆ S|2$Z= pFUce,&8r 6+#4څP7<7#aShh=`ұL+WV}C"f^yWKP>`*aI`jFAIǢ-&6r":QQk1PAcDttlSKzپ.U _h:Y kZ$hIĹ16Ee4&8 Y M ^j,)^_:2: {ynF5ϓȡÕ 7wHEŪT v&!@4BKTŋPSèB3 U>}9Yc,琹*؀W+%nk2^e1FC6 G!K4*_WY _dŸɠ EWS{.gٚː3E:RBԅ$Ac0֭=ĹZ6^r~NBvNukgZ:[l^?n9;v_,km8pY"`ql#1c$A`Ub% %YS=Re!1LY}Kz\@pft]k]OƋ%ry-I+SF`6wzU' 0^sy;h]@juh{qU77f!,eQnW=?5y0oZ!hovsӘ8 XP OpX;Ũ7r˦mXۼ}~+ȚGNU15˭Oe+*6i2]dK@[pO+Asߟ/R?“?wbd 9>?NoxF.8?ary\j Q)Ia|QQ&ҷ lp +>̎g䬜UW rUD `7'wlPgswc8:+0 .Lm=QYeX"4xWh娧D֨ՙ,nR.x.`61l~Oy@k׏x,9| Vܝ>c jIIļtXV[6Z9@!IC1sia9N?`.0 |# T)J|-X@K_2ne'{fZ?ˑq;G5*v?9˷Rya$с~v.h@ ئs# wB^KP2zEPG+YpQgW12#XP1g&-"FrYmg) +fGNw8!ieCtQ<&3}g-8̱16sE$Ś%4xЩ4N|\T`2fSk %IlR!JD{CJGt`ZD"rREvm'cf\vEjgJmWj[IbJH"JpA,yQmNrh1 `*tޜk' LCTp A )tRp{U*]ɝG;9aeԧ 5a*0=vF"JDY"^"qǨVrT:"Kp`{T;\<[4P9kk6Hq PGT8&9cuXҠd0IrgFvpx)DPnN{Y˒N w}G`0Bx7ELD t~ VP'Y঴邓 'e- )rwpwJct~5nT&I V5NJ^=L ]> גAs|;} 8ݴH0Nݽ[Zb{"f= 6 j|0}<,{_V78g̞1Oǘ/,CČ3;3W'\15l+ItQ;ܰyhL՞L6K<]7zRKR te{x awSh/`FwabJ؅7G&VG[=o9Gfuf$աw jEЪ3h1ocJ)ۤ<̐b罓*)DDtң}W=sKM7(_*1o&In'&DϑcdC2i4ag)[qjH`LK.3L&xNLNigGH'/l꼬:m̽sRHCPӄ 3J s-&6+?$ډZDRI?`m 3W@0lU"WM *eCT$,وD"Zr/D d Jrr.*Q+;o &*W#x˥oR\\Dpj7rىnr}qc x/^z&_;XOmC!Si|}5vh:VXpXG lW`}77dha6G8gxq !W Rn-)_yx7jJ3vĊwc 6✥yLUn)y9e[<=S;[Ƣc\V\_20fw`~hǓr>tj75߼&`v}. 
&ctw%_BbՁc8vI'@op:PKP+q+u}1D5Eb,˭.")!2VYkٯrUn)iV+KeQA)3ZX#CVaWIY{ɱ5Q -9+R 3|僱B؍wJ%S%Ĵ>%Gv'`وD;2D]WJzq%sW@#j%]WJڋ)`7hZzxZd:xUoCҾW[_WWR S׸wfn M59cP0tQ'^ VZ!7ˡæPjrFM4%ŠsuFSu{L3*TuM@j埘gy=6y,^F ,hawVzJ_ nS}8lgtHNmE.^A | י)= ^EĢO wh"^F  ޡB#ƙJjɛ+Faݮ@D)=(:Vف}hi.k~{^7]X20*P ;A 6 BQdAN()P0jh6(Q$0YfvKf#g $C8(_-Mk@ #Pސ'):t^QKT==ӦXY7S˷;JJqQBP@IU!8msƑf*84Is7Ta8s2 UQm F"80j.`L9f$$+  L"# E\1 A*!,(tif>FNK>.܏͖ 6lԔxU5PT ةc-W DuZ1hbYYkn!Oa9r`q C Gv @;p<3"YPVވClPGI+v~:B,v Lt1pL" ryX (ZN5Hг`Iz-T^l{bϼtKWwގ[yD38 p*3<(5XF&1Ъp2Ĝ=mwPDeW/ٗXfc5` DF҈& R8*` 0^@L B}}c 7>v;}gweRΌ. ;&EdUa=ƔkDZO6sǥU+ 2.<l"%"{ɋDEeL!)ؓX0Ya08SIpܐҡ҇4$n9Q7" 3'EF )IHp:m$z`ۡ\yb,-@A1>"9 ΌS] 0h7;~Ck4ѰN=`uQ%e\DŜad|2!\.%6!`BLª1c(V7AB[Z/5FrӮp Q9@X1 zC6 8-ZHF"3J0r![6v(<(" B뙙Of8. åk4XMVj )d4&j[[TW(RǺcVD'kF2jꈑZ5*A ;ETc/@#yϐ,T¬G~k m%gP<;P#Ji w1A234T4jvi G 0|ǹӱp[mmknUwwٟƪa8܍+{)t2eG{L)~)?۝9|([cR;ĬW)^ wE%K\.Cܕj|֮WȆA:v/<܄`vCui0xiJg__/ap`h[nl*4_`}8Y0:VՆZjo R= < dAgT~ *-H2F0I Ct, l,2BqֺL9^/Sn&;wʭ:tʭ2rv)Z RPgQ"%:Uk&s*8טyAY()ΎG EέRȰ, -J))$1wllw5.I9ߡ

    (1Ôj@.-$gڈ|(8*E 3S 2H"9nA?"gx[[-U8z0تblO6nU|#mtϾG}yUukVk jy6[J TiX͒q=Jyc,\N5WDNh1[E7c$^=~?}ް'9ivt"\ F.'IS#3Cɖ~eKkNCl`;8b+ ) P#$&LF%K:/Ef톃\.:vڴ5ea'_ $-fZIӢ*HBl-M6Rhq'cQZ-)4y}^0M);u`lR3 @m186^8v˨Br.)J]l.M(Q =,gJ-g~<V ϵQWi1mbPky-z׸̻*Z4X:k*D uFK⽴DəU P0 b|i_\ҝm{VqQi$O^f <$TX rDFԣ6GQqzA?q)B~YY f9ki"<)lP57`IsReo[O5mH{NrFx\T9>9+&X,#rЧ}kKxB * (TZ҉xo+Z&o:JQVI.i>Iq}gnVz{t' 1gz8j|j9@608I{ߓ8~2po:y,A<-ch]/C  lըd a4wɎ Qx/x(^7_+UqzB1Kt\Mn|np"6JN I&Rq21]3lo֩Owek!R7v鯖B''ͥj k Ƒߪ9ޣ r3bhk't1{iݛ6Ϧ5Z~y_zr\x ŭkb. ]Χsbn#pu⃳wv0܋8x+iIp:Glu8O8y.>\&zٟ{8h8F֏:{ȶQ۞5k棎e=x$,qm8k aequLho85>N5k:U&gY0s\Nx;珿?wp eV Gm$Ph~}~zк8lhC3Z f\Q[ƽ>ZŸ2=f ?~ |u3O3+Ql֬~L vGJjW-< +2V#\ԶrJH%8u N<#x+q$zJp4 _.Ӂ[Ec M<})C$`!үG!9xm֫ M7Ә훟{;pp*xx9<` y /9(>\E 0uJ폣R/Q-EM")ˣ\( k>[q`[kB;/}-E[U4N#O'y>-yC9VPˆF-RmoIs::~6Eh0 {Sz}K(z<[QWcfx=ͬE|zθd{SiË8嚼Rbҡ4Y (u2CVf'<1T A9SԵGa),=EڥE9ݐ>GGz ;lTu,SD9h ܙ BDNJx$7FO{nnPb co]o#7WX`a`pM;m `? Eں%Gg9b$K[eʒ6~]_U=|)!T|^o&>v^J۫oͯOїaKNv.=/WhX)]K=vrnl8Q-d,԰lږ ,|3lMbq4sۤr8`( :҉`UŌpWyV ầ:\zz|H\BpđHޅ K?\#qe88..@> mT%20 y&-p5 +V:'F>{>)iv}Ά80Yd-4[ Qi,+QV{m"#B,d*97L!WZeU/>U^BkgY]Q1{)6.dhW i i)!**+/IƊR+ʬܹ zvZ1q FgLAdfF%3xCm9{{K%`sy!LV3Uó)ZD!!2Ez znu{$E࠭Ȥ: g -(BH6D ϒO 4DQn4}OA&tb<,l\ws>_PlYE­MצkӶ6B95HG, 7禒*@.rID!I92(8Re&NC0½geɕ )#&&lV ڔNR)Eۏ\LRܱ֦Nkwv-[4J[w(!c H8<aORʖMK2 7Ȑ u D1 $'ؘ+>yI.шc[ kDiN#bɔ,WkIOI-!$4qDar,>W"ˎ:.e}w39*bd[( EN/^yŢ\Oֺ>ս%E5uVY^]zߞ BO}VKOOwWBZ\d6UAR#r}'ۆՙl@tћ MgΗL&b[%£-X*p1 <CNj੃8sPJpB&Ct .*ReV1'XtGj@s¾x1r6L{umh[lz.__Hcwm<5. 
[Binary data removed: gzip-compressed file `var/home/core/zuul-output/logs/kubelet.log.gz` (a kubelet log captured in a Zuul CI output tar archive). The compressed bytes are not recoverable as text in this form.]
cr0_NȆ3A35TNO<{tLd.2ٝGu쎮Vxig%tqFɾ:0~Nx5mNNhsx>uEլv&;:]ֻYwH-+jYwnwz^5<=ܾW\ּϞ~}Y'(-%˭Xa,]^yJ~\ʚv(}u90\3sݵ*tx&hۨb2Z YcAT>Xy ;~ :wPw^E"jY"<*'H VVIB(8L;CުV$j FT%4'A̅ƒI-2>32wom:{rq__'Jt4_'qwb+jtnڬDg#`~oAdwI0I[R)S֖E`-G`A~',5Qh}q%\;Oe9$1B'ʑ8q鵊+_s7sl|.fiR/M bʚJRʚjve,ʚfN l뎰9:?g!W8F3B6dWdTEWWWʄP(\vx8nbRG4K681l 朕8:e5 Dʋ`Mʂ32r= pi|tr"UtCՙ!M% xPHjT'Ken/6|y!+0<-xzn0_υ%hb<;n:.DYWӯnuϭD# `V g%YexW0Sg-zG;i6BK\!W )TL: &tӺcw6Wv=hG %  IMI9RTLJ$hΒE@%&J PXҵQ:&q aڏv>ܯd(g}cDOFht3G d_"A.r! ƄVp<0i :pPPQ !+9'*|d tg9b&$#]\ d#()3DS8ИRRųHr \A()c Z5}H&kH !pɭGCq86$H6î_(3#; U1<88%I~ͮYUޯ3>;PaI:5?,<;9OR(UsflzfeQ& bF&BJW&2zixƓSJ3yJ4A ! ) 'e*cT(b%5홧As݅!魺k;BAK#{Z@'OI8-e`%K&SnvL8j}l = ZߚwrW(}7M:цc62ْ7/.Ӹ/]S4<Ș@1&\22M6(GDq,0axOVi5N7Kɯ;< CN8E2L_Mſ #o+0c凳Q?SG__]94&.M@/.a|~~9:]- G`ǂ&{AqbF\ϼUyW|Ӽàqq3@2^^uC)*]gIp c+ +HA*#)|s6I0ηxFTR-ݴm?QͫMzJB~/.*91]h8Mr\k Z4hb)J0^)RY,ZJ]jY\B1% |0qF̻G]yqƜśKkh;kl>Uw|I3+4nVB&aR#|~z7wʭ`7 bÏI:Bw t]0eAɴh{:_]E[ލeUJ~2j:_|/-¬UvU> #߇'=cԯx1S0(B.?Rfsx>jRSq^Up ZeA76q=m%:Cwf-6vtDk(ѶkD͛Ϝˇ'u0+T)p6#Js!#VF频ڳhXP!!)XiVMb_lRKZ28lz܍R%QFJ|JQI;K4x >&=1m( aUrOzJ`(O4ؑV.&7ڋiUw2/%MPvآ]yr4p:-G ӱ'9ϔm reЍ@ l_oRA, h"i ")9i!OVolbi?o9y0g6)~~a?~jS0)^8$?\җ//Os"MO> \UC9UdWfsx/۹[s$;Dhpbr+v-{,9o̎bېh\(Gu\"LD˔ⱴ3(xxp+DpWAEo3 9I2Kǜgk2h{!Fυz.L[%Uk (C<.ePj@hE0&șQ%'s܆CzN|s6RnW^+gq噇X efb EzJJ )E/078-"QV.)/#{ ?XL\5})K!ia JyCb8u*y58[o^"kpN@"u e;&AFy@J, %şg$<(f\rcU^F51eEm ㊚4t&7/$ښ$:M.gF[A&gU"Yv-oTN碽մʏGD.L!ItĒK%RQm4 h n7d^]XzO &#E /ɥhR8; +f<Ό*̷%[r򖆭pp%Z+KUl%r%`\`X> APrɤՎ:.K)魡(-7NRQ*MyԂ3>`F5O\Mi*Ax[pgO ^7uj-)ٖE˼{^y*E*Z[k7KMKkk\DVLde+.QÅ=/‡IǶ|([CIXYGW #z5g~\iջx( L+~LaoDyPӓ=W\h0[ şW/xV76a>J5r@SPvB=VDbXz~} 4@ikm.4 h-#]iD\׬郡i&AUFtUɆ-5*!+zRuڕ&6BBWVʮ=]"] %(DJfCWװ\ 2yBtut%RVgDWX+ ]!\Jt(Evutd%Ȉ·ѹw%U=] ]iM)CkF(͆.ƺBRu=] ]-,- 2(%j-̈0' l "ZENW2ѕ\s1QTX:GN=֜+CYA욦/ggêbFя[xGUc:zTGNEyItp5fѲڎ=]@Wj͡WDŽ>qj37K3/o?3ʮ] Jt+j--`p{j-xUDfDlhR*siDmiQڞ5y7d>tpu6V%t(5=] ]qƹ"(uřϵF(6=] ] DVeCWW\ juBw$d]Iʇ$BwJq!dN_D"|EBW7CٵzzP3VBWw~Q*ҕTq]`MH6tpi6_jGtuteq+,YBBWt=]}6tz}Lj#gg6Ci?P2 Jt+jwn?)**Dbmr\ܮߠZ%';lEsk~%ߴ3;tmܪfFV~d f+j{m4fsN?춸1A%Sf,- \#sYZf/- 
J.\Z`R3+,U6tp5˅t({:HK2+/++H.thU? J-{:D.~2~0M~ QBn o)\Ɠ㿍c<@c}| ?M_<:2}rto~ o:X[7Z~{BYo }c$|ղz|9i֜+nvRSw:r+e8=KRuQE$g>~??TIUҒV[Ҋߪ)cY{ dX\R87K-Xf1*a;*~y\cIюJ/8vSnZ[h+ҵ eybl5,VKfn3)nJaYo&"ZfDW|\KX61Y7CC+""#BO;][tpU6}Pv-㢧'+9e\`=XWVt>,QJҕb97tg c]!Zv%͆& juB}>aխ/ր:I#bnt6VԈl\\Yx4,F({3L Ok/U7W %yoN(xw[-Ϊ65:qp3rqsj.C&n8ECU ,F)ϫ-p&h/g5Y…Պ5l8S%70wTzquu-7R[M.{S 󑪮yqwqŦ->._l2leKjG-yawK~d5cڀ~US&TONe/~ZX>>B*b $ʻ;nO3p5# w?fg0o>b߯gy}<~u~;(]A^7'z񆞩`'z%l(X޴l%4]ջc־l-K ۿB2x,m=5Ho]6~__*7p){ӳJ%='jTIY!ZOrKFffoKHB:68х:WkUffq!¦5⍬>~)NǩX4i~\h@Lu0&! X&wUt(Q6gBIV&mh-5Quj1Jp`$MԴA38mg{IhS@ˀ/@bw*c}NivI()7zJa>!a2c1^#1>olw.`tRU-^K~<q;X.8'9 m*F)]X#ml`eXDaD2R: FaJ|6 4f{Cf[OQD>=F\R_˿繶wbjSXګ:DyAL%;KT}A> RysZ [i9PJj֌VݦHr=钍5u}gtsÚݜ|KZ#vR-cGYG7(}m f^UB - VRPR@"QŖ%!BRXf\ $b4QCAQRߔκiW-`^E #48(&` mK &ڕӘ;h&fP57jaGH 2'뭋HzND ePBdEHS';uOQUqe$n tk)c : sy:M;*r9FTCJBi8()ؙHCJAT` `ɦ= pC4!hıB;S %E5n,T B!Ь=K* y'D)$h~G/nHPS*XVWdr"YYHIERV籭R/`3XL#PH/>[MRDfNU$1A db 5WVc{l-D &OKA/ho{ .4ΘuJ(No(ƌU)vNZC!'$̿2]vSz~= ~qZs_=bF ޲n=Wzlk݅LC6=&d-$ >:e‘Z/WIarqV2GP=-Z*h`1 $hm~mAOu~sPs71q5 oC NbѫxAi#*f`ʛ9L!-\^sԋ!+ׅiDF9,zRC%#pU{eHOA,2 ]I2Hd~"Bi4 65f3R-rU>]Ӏ XTTǢ ةMP$9e. 8PkBwm4XDq$R cvXE_jp6l R. 
·, ѢLJmw|?}b'X P6+/Ƿ~8ѷJY`wfwNJrѦIn6g.ή> 8n81N[pWWW''ڄg ii>py}\8jpoYӣJOnom۝߶7 m\hcK grGmn N>%pU$j4Nq BJR; F/N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'w:'*'Ѓ{kapFe@Bs88DN@@񑮅9~>:L'o8F@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N q+tU),87q˸;5@c'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N quyK6.LXZohqDvUbİ}u[7K"{Gʝ:{>\^mފ\[h@S l3d=C*OF{uH(Oq@.ν'58 Wn ao77͟q陡ݴ`(OD3C2V!o&T}dbAq4C?޷vr'moK|[ߖķ%-moK|[ߖķ%-moK|[ߖķ%-moK|[ߖķ%-moK|[ߖķ%-moK|[Wpi}pޥxtXj:Z^^o][fQ]^_\}u%Vحr  hmxv+s[dozMWouhy ("IW_aS*P`ׄCIW{jܳKz)(] !Lp}hO=] FWPN.@ϗ=݌M+l|a׀/ɗ'xW{go.B듡, $)-iA5 eePNn>[gt>#Σ} &7OO;1TCO>lZ~>'$Gtでn X 렿==k=a<k|ir=t&i߿˫vywEo/gtLyѵ R/nm3WS݆#o3wcn7<>?؆*ayzj%3:o~f_ ߷8r>?{@+c ׯjy\L}uDK8;Km(l<=918gkÅU1+ؙtɦ _tԌ 6Z;v}ֻZ˹Gw`ZጠjƅweH6wz>+&~F }w?z}q{b7fy~{58.P\+{]M}Ÿ M6@ !k71ErIʶ 俟E!)i$QL4rYF )) \^pKTwɫ賑2%&`2HQ-2HkܖAf:f9&f^2I.3C4`UyC )/7K  r&{Kid)6K;nt[I=<\Gn8]3$oVwA4ړI'3:0+?BL1;Bտ3ōa ,YszET*VM?]^x nctrM~ptl|Αh<|^F|wџ@BHt4hFa67{FZ>,Ohhxzг1gə {$Fmk֭Gp>M+YxJ$z+Ox̲_7:TӞ碳hqSڎo?~,߿Ƿ?ȅ}|[8'0'I ݛ_G܃CϷڶ54ZZ8`hۜ2.9qov9nmZ $ގ"huH_ЖsKHzʉ-b; Ml~@]TUIw)YEl-GC}Γ /@SA]VjӾi]EH_!p$`rPc(7\$X q%5ZHlXp8&H kŬ,B AH 1,6XiE3Vtfi~>9(ıERde^NnU]DJ-%k=lg^L[IYa+_e*/V3aSr*6tFD}Ӵ5ց6D E906`JFFȝ"2нx=mO~/ARhBdH@H9lv:{A?p"eHvAEB Ir ¤6*|Z#g?L"8W˝їHnTeS'D:Z8/La5Mن!^J CX=fVi6fwue ^YFҮ$I&˜9*R1£ui7&#F,!h6 wt{]_Φv@o&{[0hG|hgڬفo;ªoV [P)i[0p)& W-8TkP`i.:3pYI/e ae̗H&2E1@Uݤg*Fjy5rc酀K/S8l+mtf)G,?8ZoM\Ԩ;tDȀH6:"51 ռT-^mh(W}Pʉ*qJu2Q(@xl_k̾;w>r;qc}h>[uzom~)J: zF9+:+鄘^Kty}MKX>_G 'څ6#ś;n d a%m4wtcNPY'] ǂdG4ַߓXk ܯܿ/v1-&;@wPLzqFɺY$>y-:FRݸ*)HEaU9m0{QFZcރ:w+װ^kc~G6Me!AZ#3JX6ϧ0c!ױ璸L{B%7č[ #$yQ $ӴH`FفZI!G/*ׁypEx+,;,/۲C@,qTcQOD-yTGd,Fkl7[BJA:Le|9uΐ5 #BF&sۡ?,:;fjΉW3ڙ=Ѥ%bGŌ#Y gDە`g=J:!$ - TԡDbUVqFuҧ-?f޳O}'Z^E~ fI ]x*Yrd`;U~irvlT},IXB[$Ü;U A'$Θ]:"'!w,* |y))Oi95 x,BKiBufƒ'5 i= j|ILG yA;9(oA0 `AhdνL*#Izl6Xg#8*Sdk{dl H*-T@R6NR/Nt\*hU8!Gdu֞Gh+9K.Zٱ0_k[-3:e~% `,8fP'2Y 6.J| тAR-O~ر}c ƪ0\ wCeP z>.GƛW:&UݤZ6y ٗk QBOV ](-{e-JPP(rVA,y(uL Ii/aCR.$b,x*z*YA] 1Y#D Z"5Q|X.._BDHS<$8OP0DrLZxJD0E[Ʃ LA8҂_)pe|TA #auY&%e"dRWR4  Hx" 
BzC@!gG^dgHz3̆܌bsp--zQ02mҀT[aJq)2+Ɉى1HsNP13_kc^`/FB  x/(fPA䩼E)F`ZŎǎǦiQlC~OkˬmA[zk۝A;܁Fp0Omɰ@j> ۨO!*$Y9";KJNEZ盋t׺k{D:|fTiRYU TYb Kbܴ. VP:!b U EtN ω9=sb9e)M@J ^h ZE2&nA@+II8D YԩN$+:kGYU>v$=~ W[oJz:ob> 4L{F}`gٳ[u5iG۷P)E5~}!/NggSFedK$w+Xe)$E^g wpg 5ϙ7홉.q)$ I0Q%AfNF4g[4cְI S* ˬܥTMKzO8$.2љeG')K90te!wGE}f"2׵ə\g\+lj{z*!YlΪBHb/잮ZǓyg{ptsܡD$m6SSMm.~"~C.&isjj]Qzuh{zٞNm'4eE-n=]Fϗw4Vy0w;Dr·yBۇE]"WtQ@P tc;[s5UCwKmmӝٜ]KڹƝţiu'r%Ji ),sRfѶ`(^W! س&*Qcg?:C(lw*QVȜ>prxHo50n$_h2ΐU]2j T%>&D ti hLf3s-s>t%@Is'%7]OfmO*Ld{:B_3s$ТKIڒJ!- ޵>5r$OwP팽Xϱc󄣞 $`f7_VwHG Zc#YUʬʇU {,B*r wNM1V<1Zl҂.m6*GT KIwPDVyZY@WEͮ|ucr6%WeB%edvvWrT9U6RK(c: ΄܏ IMHfB%В.JK@\ńKB{!V 8FD%:s=  -v1waPXr zf0J,4'(xAcf49O+ .Hx;sj.^ˏ>BU?zϩ~~Sc3*R]JLZ۳dQy9.G ګC%xWj娧Do 1-^e+W JjLqi#R9 ȄYs` u΋ٝW@v[}Jo㛛Q"w Nx!.TKEJoj#cd! b <* 9N|0a!`Ql<)r59k8O^0ۆd˲N9vǁ:{tl9*?}d2:XA"7H#G#v H*%4vHB.K"P2zEX+YpiL@2 ,n FlJ6,[b}, ĵLfh}qMWױk|~ݠ4{gFc'qc%b ,!. #樋I5"0Jh>y=oQhd0 Ke7,_2/2m.Dʝ?&9O#ۢ6΋۞uV4XuT\ )"SD #3A7KJިh6'9Sh@0 ,T輅["Og`!fSzG9AR@0 0GAR)Wnl%?㉓{ 1'9u1x'E:]`VJe0 \hl-0@ ;D "$^nظ^gˈ1Sڂqy,,0(=%gҨ@MDfҗrxGЫ6*Gl S^CRL1i`|s5F-;Iu( QDS$(M\Vu]`3p ES!T~jEM>OF|I}5>i>O.{?tc-G_وG W}‰~|=MΒ7~f6U׏ZTEhz|孺/Чf0ׁKNOMb;H`0օ~kO3OӂKџUS}M$bO{ >JmonwU]c NWWSY\^I1=?I2Ca:M6,' g,4k=Ŋ@e^LIiHWԋ_-xm%.v]!?#hR~x Gb]yFMZ7| W=AE9d4!j̤0'rg8E K}R2W9|A7rWf3`:g68֪x֮ݾ1o_qgzW =On:s=tI5 LR53X1@kU\]#a0hilԩMr|]N͒s=cvR{m=] 0S,`Ð!)>3R.❉D?WsX^ f +_bGJP(G4R9O8\ƨ.z1MKb}"oKK`A*9`Yq#=tULtg?Ǥv ߾_m9n0nww:hx޷1Q2TyEWɇ{!sB L0Oi(F`i()5VBֺ͉T^0 v}!ە ľ*Nf{>+}sR XK+/iKFetE+ǡR݇E2U'LUCT32/kLV[YG785K|-qE#v|pV уVl>ZYK0X`"k"QKRTQ͍QmQ'(#Ɂs9hSF|Y'c mg¼ 3qcC5<10cpTQb Gϳ#xQof:P9}݉` avc%>F-3SUq7*2(PR+9;lWikWRYዴ-ZXIµe4n[/Xp φTk3 e†:-~>V̢~is*Ӊ^-pz4{ /(]Q[^LQ h[)z 3g:Eg.fe?z( &W\wpjJ91F pj3jxڌJWz\m=i+tΠ Min4\V["[u'Z ~)"Swh8zM\b^` 2u,TQ9ynKܞuۋx;2p$z?ꍀj r[ t`8]wYa xMN~*EǸ.8߭cr[u`|6p\0sq~}_XY_Ub`5b ڼ+Ɨ*Es=EfwW ;C>Q+R[2Eer2;˜ifɂC7;X9ee,֕, ʠveGt:qi:tWI^r! 
RKJ1DmY*)"Y + \CD-UmD%\A)qHpE9OWܗٌZ&WJ;zpŸꐌA XQv0p}c ֵwD傫aWo|*{*9MeFĖp"(E8^s ^ dU;J@$z]`K)7jʤ=yvs[Ykg9s9=Wr7(ϋoiuF$gGּkrҝ][9KY>~ë Ȼa"\OqGh7n~s'c胙zcta]p0չzJ4̨م%Z؈{"0ݍLnby8cRߏ"_O0ƂqфO〦?{J|*:_:"5qٓlꐰs>u<|K~-qwiv6Oo3J-!lJJ#68R$#Y@SJxaXe7?^n <*%%Y??]} 2l<9" Ky`"qQܽBA~US_σv"qGfAaQȳR3Z"Uۊ% h釋m+X0VzMrm[ӚV;]MK mʛ~A@"JQP-D~ZeSvQ?wKݴ͋vJܻsgu y; kS +㍠ _αhR aB^@(i.U(rneZGEdQ0A[,)Rɠ;tF(Ԧl+hǛb<>l5_fSԀ `,_;TNB\m2tYZ?S^~SKaெ=X%w)CGnG{|m!ӌl iRکT/R < = x*%eHT 9A=.Z0eX|iyKuy3;LёҳJf* D&ZwopHk+1٪ێ>J lKKsEf):K)in#G- LvfO{ھͶcl i7Q$%/ӠDaY  a$v4䌌bx9#tA_///:>{p(բ(4KzJ>EB؋Ki~u"\52J!u+yʒiKɁ^rWJ/0rI,QJmV̛@M"Q)y^QƦr9;[>yrdN1%&KUDt!KjM79ֺHK&,3|'.'q~tv<)2 >oqXۮK }VftOz:[}U{IN6i)~j]7/ްrq@ܺݪrNG:}hsX܌w-;n٭G'xO=PyGOsI&KL_0쫳~Zoܜη ﮅiiϠϷm+A6ʊLo#~`~Ʈ-_FfAvSrŊ$ EM::{3\Kip]G,JwD:L&Ebb3BzY+'PI҅| MH9-W: gnm_O?َfw]4Vmxs23CmmCL2u&1@$#"e`of?. z0iK .)]QƬٴ,VK2ʗ%RTEl9;yU= ثA9R5JJ|e+nߏ*cG𱁄T >A]獑FAB^FBZuҊkg] Iȳl@t!(&ɀb*Zg$)P EF$o :A^"er$ &%HgYAE^~}'l+糌'i>Ů$1KsYsOC}>΀OZQ&:y&ڜ $,]&x`Uz&Qav+"ly_wuZ9Lc8=RMHA@t%!pR夬eFk F1qJ)qRhpv@v_w4!zmg\+ƛI7y)cO֏KƞDFl` 6_~,$U:颠Љ'%!$]xeCR(ب!$e7 !+&_ zu.6{ia;C1*xANc17)?FQ/3"jb "IQ(gȗ3 V1wٵ28OśϋZj6v+Z?0;i#kJ4E$su 1W$ ORx~$MɄKW@ƜRkv@͎&~֚9')+vE>68ԛ̎-c[ubаBˁVX.!so!؛p_ g5 .j((" X#.V W}sݮ4*e^b]G./Wb;PW\[eJ14z=勗ɍ>&m|]^J4)A¼qZi^Wmx5e.Z/u2ȡOXځJ_&a7}a{`ACg.).x犰Y^]# W"We dJ-I$L]8AZ5R1;FJzX\f(UؼTȾ4W"A?~Yݽ =TEf9})fU7<t9B!:b%;2xDҸA6%JlrT(1NJ[.YJccGmf Vyukl;hoAS.]o/x&` Oƈ%Y(@]d[ ۀ,Պ &Gt QIIxyT4kkX L!NGt+̜"ћqfXld`aXse{XxwquMՠ77/7Ö]+9(t!X' 6fmA 2JgZkwH{] _*jvҪ¦fRCAam'|(ʄl+@9;M'%1sQ[Es> 0XB;pxEdJH# sDїĶ!X1"sBaVK!+df&C@82F1QĤ uΏf^hØ[sQ5FD5  Wl)A;& {Ѐ苩mXv{2.X[Ckjk)05:kIRq&QBo8 :{&ZAfG/2ĸuUkFɹ8O.xhC" ]݂,< 8eX ԧypx-xlt333]$_,m?G>"ݏQ' /96 4=Agwtz;6UnqXڮA OdI @͕(*5٨Dd, 9FYkL2)}/l7'k ܌yn8[nN24~ˌ&۽M_:쓝-IX[@vu JQRU, _ F"\X*0<  EJh52[('XEɲ,Iϊ$q6 hߒ/QUSAu;1}aKRݵBÔC3sNHA( 3{h7[0'T1nq- nLM<(be$YdDX aUYhB6 8T xt@셞y~8S\ Ff/K`re?^:l&2)VFy Oo!m!h 8޾0rz`Fg*evl"RAUT. 
:xDB F)q{f^{g}J-qdQDZ+q:n 7lėZxuuEyS?BvJ82KheT3ERDrSK% 00~V)$%2-eUx٩ڒrYEV uϥ2~Ϧ.yzhZ8 %5D n_S1)7y_E.sGޭ~swxf:xByʏ6h'_&[aMq{\kyE%mᦶ +@<*:ϔSVCW~"{a܅qVU#w#pC=w< /r?#MޫW}~I?q i^f rx62pUql 88Ho 1v/{yӬ) 7J NxxԗR.rDj{HD.-ʨ9ezz~+5)6@7QJݣ߬{Lǣ"Nt ?K\rrc7"pBSEePFH)~Θ T0!U ULuQٮȹGҍ+i`ٺ7M/N67OU-kU bS)1$~:ÈetأgBGB=ttf|v!,J0,%D Yw/JT\ȘAC@":1`Vl0+Q/͋3ƚu ƚ1mY?}_5nU3TjgzsPe \0Yk:xS;JGY'UlN!ꄎ-7yy#^ye//ǓO"י Jow?܌8roG ۥv~L8*\ EEy^z:65rؗWǵmEc q%^㫈E[Dw mG=ȋay1?{S9Įlg[qc}jk-7w~B'j~J7hZ c8vKTb/爼= BeIl}SDqs{1 >@) kh}){pz鎜lJ˗qzhRjXVyE 4 $=-#x-*ÀR˄B`cw ivƶsgiY{MA6VVs@[l1TնҀ$KKdYc'4x~ǚí;_um},oW3"Z+O`ŠFHo@SqED !*DPwYq]a$X{bg6@K(u pz ȣ 7D&x1k\9ggan,(NrFIexd *gGuʕ_H.M 7XLB%JkQD<s8deQ7[癔G}sX>ʥ~>УE]Q@@_M\b4~M-'j3Lj-`V*'ߕq.D=OPch~$; o5!N~:Pơ){AhrUQ Fb frsj6jU 8UEDz6ư(^ӏOŅ,3U^οƿ/BJ 9ʡW+r r|I|nqrMNMnOke쬹p ̮57|A̹8Y=,v8Rƣ.gə-wG/!-= .Z{ۺa(*h\ɧdfx44`vc^/kۻn͢i5RBȕ ÌS=ŸvmkWCįvgȎקߞ|O/N)epzׯ/83Y1@ɣIqGp뿟y=f˷n>._kY[)؃ˏ_7is4?hjq]5A\Y\OMy`HVV-Se7KC%Bˢo퀡:Ik Ya7&[ ]%fM JI0%֚o<"Yb@2*9Ae;p I}D0o2q2-sϓ.2q{e^7eO<qq"dMY0ΜUY;@@qD@5+BT۠$ue6%jKDE SYص|7^yֆQ & 5KU]TrGQ}QTk+$s7*0UV]WJ{q+F7 d0S WM\7:oXlO4Y!ɅG=a `ADŽErz_9yQLƳiDȒҒcˇFs5Z<~$=/| \>~ b4#u^Lo2T+DFfroU&W}Wm[/2zq+DPG*ވL!"ZRWi.)#q joU&W}WZyq4=ŕT+$XSqef_UV]WJً/Q\IBp\)).doVd˸*҃gaT.tTyN9VĔ%S ]pÅv e:#v5r[6Npjk@Ndv| w 1/xB<j[#҈;sd޵l%` &M>UI[я?{WH\ w]y  AܛYE=IlbKeZj([3Ӌ5VSn9b$kF+=U.6Om דGt2쪱}Vftz:[}]cG״?>s7/~f3KxëܺݪvN:}hsFM7e-h=}||D[-t(1?\}F*ѷۨ_I{٤6x|]0>_#{ @[)b^b&kݘn\=ع{Eѻ@"xP&"٠؇A'j Tta|&VF%M(0\r,lMF"Up*F,= XDhݛsvl>ߙf6_eq;#unyj_^ KL2u&1@$#"e`of?. z0iK .)]QƬٴ,VK2ʗ%RTEl9;ywUǾY>[S䰮+) ^"k~lpNj.o%E6'A 9&Z8algݪ>*$cGbS duYH(HYiWγeexYqi+^V]ΊI!kbdIr.VaXPD,mTC.J&YP:<+j(Q|~oYtJ$-P]Ii˦$2Xbu VLNz9/H!-?6:vz#:j|:xmfK-Uwս%USZ!1~y6T=jꔷSΨHC] ]ƚZ;LPIߔlŽR4omى`@d(%%&N6Y%:#Ќ"EBQ1KSF) :4#hq%9F琲OAlIL@K? L3s3ʊD'hrҋӭ_nJ`W "zjݵu7ǡH9Şzpכ~Q::iMU<Q)NLab|F-#Dgzf1|]ߝ~& MF{ !E)>[AEl;(] `9*2mB5(JԠrAHYzSڀPŨ|#BRG䭍F rJ[Y Me<8SFh_8ȲF\$2wPN<;XYdOrQyZͨ.oG4=jfWWJcAkLafDJ)R&GB`R:0y55ZX+ac0/2&d9r,$g}"* |"L4&kouSM9I(X1LRlYɢS7L!4$PV14H8Л'/A]Ai?WYwpzwK=|U_>]Ւ? 
WWO79y~ѯO>L&\WwPli||akǿcilZ6Bѧ۶À"gu4ڶT:RWTAP<6d˫ZV屵AKϖWTv̖f!g.i58m_Whu|@DNJ'QV1JZ+.(w\DȔZ[Hp&kbwPhDhb}rYcJ}Ii_/D4vR}ͦ9:rG_ Y (]N2tH!NXɎ '4. DpI:05nerT(1NJ[Yd)9GSʨn*:ϟ+ ɉ+حp9fU[ukXQp| 0P'cĒA,lA.2-;R m@a{0@dmC!*) o#O|J*RK129tLbaj9"2WiFƉX=  ϊO/*3H:xdK]~yT/ص JgAb`cVX-.t&jZ{mbf))*ljV-5*v‡L&stB[RP8e[>5 ? !oWWYdA407J}Il5i.#2g.ݯRluYBff(k.k( t*+#cKL[ hfa kl @"TI&"NΞʀ w530"~[;Eu%bEV8L.2ZBL^jvu >RtV`)P 5qsaq*6TeOTu{ HFy9t[qއQJ_'XT8G~ms򟽳=UۉYUBg)ΑQ 1 z<ɾ_;n?{ϻA\iž{>0q&B.Ps JhM6*=:lQtms*&h[jX3|*W$O*#3SI8>"Bc̜#<)b7OOJ)NcI~`qܴ۟*b 37b}WΦ:생-^IX[@vu JQRU, _ F#"Xy#08  EJh52[('XEɲ,Iϊ$q6 hgEIiTjAcrL_Twj0%#̜0IF؍<>ZsMJ_kw{뱃[zE|[`EMgrm9D)+V> Re%O 2JcB;aRE x53=#|w ^E~ɽt>ؤMdS2$#4AJ#qdL AnhA[救z tVRfg&1!TEBhL!!`t')GoJbw֧bg-C%CCI # ^xhw?w}02eYr4 Kn2 o|Xgu(i?'mah=.4M2 vͯq:ޛՍt3Zgu}fgUb*RwWҮ5HdR)]m&Ey<&goQsUD5M&mSuɕ]n+b|s?'Ç gvѪ nwTjb|c|8jۭ>޿ q_'GV7@W7h>OWW& |4>b׆/:ָkEYqUt6g+?}q_ˇ7rqB~9&mPDۺ@\[I`CƠ?~tms+ ~MZoHAV-q:y=VX gXwmKr،`,J:؛ \`w 2me3Ç98M kN:W447[Np׊u]Nsսe~EVUw~۩ҬmolW@mou~a|mR8) Y(a u'--YH\T$[7Kz&|V{"4M]y+ײqofFt7fFZm`9.:E 0xH9Ái&b{Bn$}c!f9<&?}7avdn;,m`>T.@:!:m);,-"3Eb2ggN~ . 
ff`ouH|}GnZKJQ!;D}SR 3aU󗦸m+(ia>> ~ ˓dހbt0\57=M]_ϣQCTfv{=.%9ز4¶U1vŐ^q|WCݘW#f^PzI~'4fޯя'!@,7`W7o`oK޾ _~rzV޿Iķ}=Ϳ6zP wWڤX?->IL9z폌 @D5Ё8@ .(\3wh%; ur`xή3xf_P$Z*yDg#1I퉆x9Bp/rO-lӓ\i4.i\ʳxtv5e쾇զg͚EO:8|c<ֹ48 2\:W3筛TpT'N~8NoO2897|3ĬDdc~LOOhjۛ7A HqC<΁%P5Vt:U1ѵ52xxU}nv8Dn-˵N9he0@n,qf E )/$xYY+h6 گ$`4K, 59xdĀdTrAe?p I}D4&!GLbyf8/37g1_^|ܹ+$"*'LTpFڙF576hzbE^y[QޚݝiJTCKZ9b͎3Y7\4tZ0p2 c Ad")!Hq( ٭#xVpy@5qZFY3^OA&&*GY(6#\CL$]Ԃ'R^*$Df9Rk<>^YZs''(w:gN%t{d=K}z-*/j{vvd!Q,{1$n;UMOb?Gv==@V* $0'Kwi2 \=MJAW pE{ZbLy:T}=ڠ8ɒI.\cDI~p87gե&XF >"숣J$D ύ,3_Άgrv|،U,Zޏ1F\x4N*2op/[0̿a E<[O`9 a'RR)n xw~=s ͗Z_o8oC 0sWkTalHE^)x&hlzFѦ#HT.JE1\TV(Kk 17Μ"k.:-wH,2Z (<2&ܨJI7u {1֟q(Q=r-dBVWZ@A},{WY\#PZ}OhR^!\J\,TWYZ]+}=\T1'vo})ErJuR]FfP`UW*K pR2ѳW1 AqUXɽ,.}+v1U5•f1GpVc fqҚJIϮ^#\F,.ݛ,-| C7_%\N{W(>,ߕKp}ˁ+ȮgKFi+czOo6Ӥd[pŞW^I!DwwCUR"qSo֣†J)YZo`g @B+R8|aDdlo-HfJFey|wwW[2J9DRI$y#rZYrPwp ES j"<QĀjb :gDB0:QYíNiTˉZ瑅~ƫ{#߯/_ZE$"UA)}hX*:tpq$)*6pDjOo9c '*Ԁd| 8~4BR>< B;ʏcq$)j#ys eGnD@ : t !9G6pؤ ˫Ic4(A?#&p_7&n:߹JG͖%Bh*M%gtӕ!ݓޯ2\dv[*hY] ַfƭ✐駱2S>O7/q8$Pe膀-s?lןGST)ݾ]JS*UxX|R{_\9'~ s2*cڣub5 uAlJlV7(nזض%P(lo;m/vsrSe)4s+*fr࿮eT2(7w6j1LʺeP産| iĦy5o4ɸ0fp,!}BhrU:ޅ(%\we)[DW1#Ϛ Mw1MW^8wJ A]m _fAP)P+TRiʏabJ4.LB 5WCr7PTCϕ͆}n'/]Kngc_ Tzxy?pb혷1[l9 f@s^XuӮ[eg"dt]=55'RǛϬQ] 7^٠[2v8n]T 9;j$ԞaA_8/,BZagE@-PcvPh(!) RWĝkEyQy&d[kgm.*}ߟԝ~цhw!]\^KL^+SiMW,E+ȡ)7hb*%D(~_;xL,4X p`oXLap5r+î3$ ~}yx]t$lyBIF$  B@;ڼ͇\ӝNVVr:gs茎 c| JUQ}rJ0.9@歎!vaEϬU]V`ohUݿjVی[:W By9_Z@uBXx͠( B\@C !sATOzw@Me7U!dm႑\`  `< ^7ZE!E$Y,Ca&X{Ls,a (aY$J*k00\z(Rq!c"b$=T'f , :/{ Eu}<􂇚Z[ ;jbCMl^Z 쐻( 1B9+?1켎_[r9 $B0:Q3 7vs;.,qJ˷CxQRNEQ GcEey[0HkXz`rGwJQN Ns˿.U;@WZ#I x3c^h(ϡ*W:^XRQ܏R+Lȴ< " N$c ֠)#CFZ#7qg;zf2u(\ :TE4!D-4rN,}hœ%g4٠42HEAUsbFѰ #HZJ,t٠$NP!$818CSjrҪ.&Y|Hs؅I~~*|(88/2Y2#m$XfNiT\!G}S5C0dGcrt(G,SGq`_HtWU SEb:mzrsRpgU,8V:]G.waRy?ǛUBjoJ O/ǫ j41ܠZ= GF1+6_@'gK3*?wֽYZïٝՍwWӋU>1Zs\a/fsrnW#hմ }ᆋ# ~˦aX0J7X>Ŝ8hV kpp3r9f?p2s/>}|O8SYHER>Fx1ikho>47bm f\#7{} q+0 \1秽_lUx䍖DDWC@]\2/QeϺ]En,\B%B ;B4 KB#|fSϡRˍsۚxF-&GqF IGQR-qm99TR1H+is[u2."KoFylOu{]k ka2H! 
JhiMu J tF-)Q)\zP%FMq7Bb(b _.bo^F R*>"{SlM/(!V2G_0aȕ_!޲Tꋈ}{}qcC6cX3M.(h#vwuQIytRJe2gK7}ٗζ0<%V-&9nޠg]n~`Y&?=xwNwCz#f<ᑮۦC ?<=y|K'=t_&SeKR _ŤMf!׉7GC wJ=Hw^-خޫm 7Kmv@-9յ1> yJkOJyoGH\\)=r9VL$9EK);$"!^"""⾝l1OAQd*T?9Afe6m :a }+PJ|(YT#oTQ'IF V U] a?PKa,I6|yD4)GS<gqEsUӿjS$į'!}#}Z-/}3p{rUvKl$f\f"z +)Pl f{<ӍUh՜fMfQUaș4-d P|]̹|N'IJ10ΐp6^(\``a^rCS?]T6CwyRn\> o-c|mdz*c M3*TfqJFB:٤ „e&/P [=*uin@4 F-a+"G(sEZ$ǟ6'`]E B@B袊*E&8E&hFChT z)mIY%~KyZQ2,旓+eyyyWSp<)ΪA6i]*aWET$rvBY#)2jXb;? vQ9MbhGS5fv wZ E9#ARWFښ*)Y ,VfD!Lk[nEqEJ@}Պ-6Dk||<"LJrY0[1& %V^RhCJFɏ͆y<ؔ/bZ^:!.~5ӏq#iރ%i}{Cqr0Ml]OqT7Y }Yq#f'- B}y~zZ. ?ny}:R:&N4NAd?Hs&GrXAAk)k!Rt58M y$^Z$gfpNwȁVYΎy+#y)KQs) -=( P2ȥ/i:~;']ޱ? 5`1 Ɣ(5+|&)#jSC ;XlYx#Ĥm HZ5T1R\@{K^k֓zq%g 2,KѺh4мaJB2qFFk ʬ6` I_1BB)C#iaRA[0Ii St`4% Cՙ~ҀBlZ Y6֒ES(6D*A>\F(KB%Y8m'H90:k.KCLSoU<4v/{Њ}TYgDL8]_^.QXgKuSGmOjς z}yEjnR9'4[f24 ͓eh~gS uYݙ-ST{TXm|vu"SzH]@0IPS& MA: NkZw#<'vm$ʃL5$TP)1 gt6"]A{4),2:{٠ bTY1[l$kXW]MYid{d4S=CbGbɖ/x˸D/|>N@Lп^Ѵu2吗C^6(/2QT]t^`CQ K[J( N/%G*b$IEB9{[YWh8Sr18;p{(sv}}e섏k+;mwq[U|.L ɭ=Y'nw[ћv(+N. ow> I5jtDWWO7W?~{zgWOs<4W7+'rZo?}qにl\ܷy1~Vw ?6s'n.&s_KGKmo/5׷k׼C៿R1vXN܆o'\y`'[?p<0y`B<숮4nZ ]Xut(tuteNwDWvCWWI ]1~/+F9$]YZ!++qj>9ԃS.;ZwCW ׺^n 5JWR+o]1`ڕBnUͯ]1Jm]R;v ]=_i;]Л\w:a<XcX_ͥ)u3zy$4hhmlM:_։=<Ң V3hZ ~~+銗ŪۏYBq}M!,uŲq_GgekjiMBCMNGĚPSb>E vjߺVmϿ^+u5/%.'߲41ܽ%߱OTR H,tߏ\\GnM{ڵ&wV r*I)J.&h{Q/o^~A/hw=A뫘RqCC Kz=5m'E G'w:wi#ѥE@,Me#"|7tpꅮVY}t(7 :RNi + srbjl`"]ѫ |GtŀtCWbyu(7!:x]`l7tpBW֘Qڱvute]1`++O44 :@HM7tp BW6 2J] ]y!Џb z+Fd`f,?3vҕh]`+++QWVAt(Q:@B$:+E7tR7M"bмb:@nSi{ÝKõ"b`8֮ Hy}?tEpA^ѪcG銆=pͩubD1ӕ}Еvl}7tp}7;H:]1JĞsɃ~^ij(]1`'+Yj'F6v(Xj?DrvCW]\xZռ3(C+oA tCW ^+ڧ+FcЕٱ ^{K0|XTn+ef=|I< KKDyϹ$Mkm,=#Z%SU=Nb} 7Ci3+R%+ *+F鄮#]AxDW 8}{< m'OWH1kk, 6S+Z9U کjg/էNW3 b~9 괘+@k޴CyjU*(-pZ]nT9whu8ubFCWC_(clzzz`28o8q(ÉU|]Ezl GCx۫Rf_7n1hѴ->lzz!~oF^jڽPt3 f6 _yy˦hD=uT ۾_Ff}F7 &]ӎXC%忞펕r=߾8y5_GG|2_z<ܧa&qCԮ^c{o*E(|@ɪ=+ #PQü|@ymn8 }yymf/ {_1g7h]z:6Knw [^ l~n^gYI=#lqqtmf%t]ջc־l {|7n{dYć;DGh u2__k4uy>hp){3J%=|K()h=_Yts):o W?)ݪJB*7fq!Lm T~0v:NuvƢNwׅ4QC5Jstj! 
X&Ut(Qom΄1h)EL$?'{j%{(*#i Rڎ(9ٜ\?]mJ͛ HXùڊ> Vv@I()wFJ-pϭ? yɌaxh1Z#M1ZrV>ofJdIM4Yti:F,h2 ѡL6"~2ji cYlnc0j!ShwUa}pxĨxG7@B+ЈK*W|羽*6G:SAd*X  $J!wysYUuP iH%5Væ tF_u}gtsaM nN';Yn)Ǟ\1c##$63f Ȩ*OM(- VRjH) AF5K!RXuݷn\ D@K$X'5/V[ZQW!5*dlzaCKU+ܜ-dT j,CAΛ2MC{X +TtgqJAQ"Ł.0͑&A(gIwB"=dH_(4: qJi0+_$u9kh 13?kad0Wz,IQ Btʚ9fdYk|B H\4l=x_Q>VqC hA}p aP R5٠db̠JUYa:iB0`9;vLۛm^iu:g^`X*f=f~pv]&!jhX 3 b{PT8xiT Mg%s ?:fFVW 2r i(yG a (hV( {By[ɐ$RQd"5B5 C04X yyt/1 JWLd:[nmG⭐C@8.̪d!:T?Qy'*SL l;V+.$ `M"^Ӎ?ldEUv!ȕ'X1OBfȈh2 A]Krm6@^Z@mDBMuj WH% ep7P@R@"pc2{"<ѳf W nk 9%bkVcCX DA5DC7HX] @N%m1Y5H#VF/C̓7pFL"|R:W|݆i63+I@̴LZU'J)C2~ȃ 8vGyQw{0qVcCe t׈Bj(crf:PF.7hVY{YdP55Ԁʬfz86Rjv|魩+ UDc,e"YIؔM@[T_h6ZY)0ZoS@4؀q?7zs?/fL{s~yя2Ɏ, BttsYhl) l\c F;H(jqHuutk昴58)knĘv3@9dvz4\}YeF٤#AI'? 9k9lzBg 37_;:)jV}b`C(uAAJ@25di3 A)P͛vƾ:Bď۟bDuN M~#g=CE7 F-)Ba E5FHKCH4pY/xp`\Sclci0Ih j hN\76xk+fnQ4 C5kփ*H]6 |tf҃d &S@rZx tmKzg󞃊 S/Zi. kJ (Bu(AdJ wbdzl]?-Д nFГ5>dNҳ֞fJOQ!,yJi]0d@b~pQi4 65fs˥ ڝ.!b!c-ٱP֊PpԅΝqf@Bׄru3B_]wԢ`|J{& ^Zry/ܭ yAEscM8ʺ`+oޡ8}wG!Z3U$qDf z U%hu󷝗b.~m}Mw\;CuDc;k; U<~?hX1W+:eš!V5o6p/j9/?H_[m߽8:➿zzq˛-v{us'G%&/;vZ7m77wm>\lS[a'w}z7h㮭nV>yRqmO:}F?_=rb9+i1+M W>GÕLZ WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\4\9HpЋ1\1ܸS'PFiO1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%_pJ$<όp̩ W@ij@ WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\5\9^ኟnD1\.֧7\e$1\=G_MbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J W>u0G.xE/zbw= §ݵv7*7xm|-).QÀړϝ~Ғ>XEˡ+Rцxt(zt wʀY ]1ܰǝ2t (sRK2v1tpb,Q:y:sMK*8 Rц<NWC^w.ga'ux|k^=[ΕU} oW:J3gϔ]¿~ӆ`p6kVFg8B#A G֋3/crHJQJjdX 9몞6Ԅ!\O7jvJD?Rl&u9' BLt E{uv 漜; RH/DBc>zBv,ɴ\ByY #HǗhY|gJо $V$|65V(W~_5BQ=J\Ai4s !4 I`Bfȅ^z[day̓flY~$lgbͤB8 b.c`B'uK1Nޞz_D8ًOMzzfܷplF-i'Bp (1d'MLcp;[Dž,pʂGА}& c$W[ [{䭅Zpűbe8[ P[ Z8tҨVt*+thh;]!ʝ@Ξ^]1 k]`KXg vǺB̶`=]@ v+$u4EWWUt(u DVj:DWٻwp9 ]!Zz Q* +|+lDg Zøu}:]!JK+M+ ]!XWCNWRꞮ^ ]"+6e+ &ܦBIZbh]Տ(C۞UWӋ d|:W>gJ{XLsi+A+ mGVy (EMFUf,d/\aZ?dٮOPKG8[ &׫9N#xZEw_6Z㮶-a,R nˇ13g߿@ 9<,GH|W5_osX~n׍U_Rl|74.媕?x; x{Ty½׏,x·wvwOU'n­uП͑?ںY{O|w3 Z/[V*WĶTZ_VJNj9P^ 
VF2؏~-jΏUeiWG])sšs0i\.NaV\R,CX=J n`Z,RxX3xFS2,95@sm*0oWHXQ獡X +u}\=-jo.ſ {) tE)xPLlBfs%hŒgrݶ *sB:tԪ5u\xϦO 4v2,F˓-)RZ6U `\bVOO)'C?wXb\v|m_$K3p+O~=[ p UO'Aq׶e?M ?Q=dt(χ0Js0.,i7.d>Na)ހ' vYR ^Y 5GΌD3x2 (|-GI*G=p( }+d>*FU`trJxY(tc/xp 'ZS:  )E/(18-"QIv y)g+kM#}{){nlyМۖá5fͭ_6oˋt# TT|A)>ouHLGT%Os,)($RnTQ ڂkX)%eiF p[(3*s{r*cb)`قR49 STTqpHifl ؜6&r$ sYpXe|z7M|8_myYŸ?ǣr36<$Jj""8X5K\xȒhNL* xRe|267< t\d{8C*@*f6 163a*IЦan ÌbCY6ڒ[JH0d*H 30`@1,\(f !eHxс' \ %f8 .8`Ts acpʩ^|lQk)8YÌzF],Jz-=\Pu& M@Gap=2LZ2KѬOo %Diq4Q g|0D<p5cBs183Sb\lJJE0/{^Oⵏ"eZ R bmӊI-!gOq2~sb[18e|(gm.u"Wk䃻x >;[**J`uw~F^j:wG?)QPCUяHP{rc2Ǘxg z Ȩt;մ{Z]<$"!g-6Z DE81LQDQ&ΜO]lkxԯ#épV-Jm $Y~]y-6ho>N9Z[jyhWfOb1zp%cPu)˙Š>]^k2z/tvXڞPo&qW|E:S d< Q/ ɉf2Xhv&H#TFj ZI@#h|QkX3 蝧cJ MQXG{7{nSzQ%V#Vwwmw@AmߏbO&&ƉQtsflzfeQ& bF&BJW&sĄx!|q{K==|s&OVIsɐ,Hh8A (T+=Ga(ii<0.Q?>k:>v8鑙Z/N,0@ٓpZs` [Z4K|ƴ~x^V}фCjEM{v#0@(_dCR$ .t Q)-#x}?(yFFɐO'ճ<97QFL{? ZZ%Uud@WGF7r9 ߧ4OWA*q=1w]x92FT~& UrK؉=~^-sѯhU[.VqNx_x7ǭOǣp3 q\7ՙ&s |8q5T<m)_AJ0x'sٕ N9meU\Ufpߍ+kfָ>i ?jXu"ĵa۶# HH@+ k:ߨAļ\_Гs * [,l{X),P8㤬 ֜r:)̧ע/:Aځ1w:ʞpJE}im~ݕbvqd1@pm3Lf&$~+vKd-Ynr YdX(ݰa-bK؃uD̮t=<-%,gTD)3aÒG@IA `lL #VPt}`_8ߺzq}é?{aMR,QەGmLj.* $gYd&IuڠӋi) 3Gu_3CNYwvSYO[ou'] C5x_}akk I,O nsz#/"^4M^qӷbH~Jg).&arˏ?Z vhaT^[ҌLzxY-YV^lTmtND Fxdէ#ݻCMuWe-bφr^/?7o䃦;,ɮHߙ]LF}I ^iu+͵&bZz h7.+˓=KM禶 Yt^P) ϡR,w8++*U^"d䢷bg:J f ԃgֽ1 d ^2iJD"<"Ԫ t. 
`L2;.cʽs0#ZfgA6jQ!Vp}eL"g4C [֝Z\˛MFYJƵZyTi'YLmnlm2DEV O)K *O&ՇQ%\̜WTC!$u|MF>%!6tĽk݁-\ĝNi)h.i0b R&,O6Ι+2"XƵ֢9B7Ks{-IZyM0 4|F+2T{n ߦE>u[nۣo?FGϣWBڑw9йi.CCv/ X_GhBf_ƵmR`W^4[eě/*BPi` asfLnTQ.XeBL*@I楶ڱ>V8l?۫ㇵ-0g&vS-'kH^)q)b`ڔb0D9F=l  e9WQNyL{H#RB(hx*J!H.-]댜-]ÖmyDžaH9(.& [Ebt_n-*=Un2%eD5&I]:糑2%L :ey9L,h/cADƴ9R9#bP9A:# R?jz9 =d/.3sp|(TR $` N $<̲f!ձx]fίi;*2V$VGҋs2*=dRK-7:W fUz||@mCc64V, kw^j;3a s{ޏaΝ@ڑ9ι(PAX73Z9SrL7981J˷a#" @)CZRRN'!GH"ct Fג9F!YǺ3-O^YzeicDz֣ s/Fg$ːb]05U A9"ݔ3h=x rϑ(W:]XF^ ŖYId`Pڵ98hA qDrIKU8fyݗ$}H!ۙulTˡi{$΀u1qd \4[,1~?߇f!e&z%6P%& )8X-!{I;&ұj߸e]]7:&3LBW%ss4f$=1O8[vsPj\pݹ܅o/K74J𥦃V?.zZ5=nLo1\uf|?/_wuhlNR.eɠ *XiliPcF|<|G8H@10K}0C\#73zy4xvNm55KhW`=w]bV{e3n'[/i!<}Ai4+pn0>v:u1]]}-剼ZP|-ƤhJ$Y2ZѐoL jajjɉi F 0ks4͋i~MT6w7/^f71FZ$}I^qd0~՞Ohi k @[GuHgmÈaifuMlh\OVNbWcφ&g''ݣ.&6j\˰v98mOBTA9Fjh:ۃ0cBWiJZ#`a$m $,ZVgOgH^,kRH#t t&NƇaE|=AUN5T<˜9I*P-Hk@[ @{c2Һ W=v¼s䂛Ϧ6@"_mxȿ`sn5 |7O߀ġ0%Ritf|HK5Y%8+/*1+7_U:ֳ ἞Cb+1\˶ uҶ"hנMa#'BZ$?iNi]Ҥӑ:bEVQb΋eu`lcx<1G1wJzrԴqBxTGdWj]cI) ҺYVʗ|6PA=DV{Cцܵ 3r6Bd1BuJwN<`*(w:z<˜gOTv!C'>YO`d{=*6P&]iH1UIT YQPDTaoH| LL(͞߶ i*gfBj=z*H%gT+:>o!WRP-T TFto#(nU?~;.M9*B J%^|sramoSVGв:b{ʟ-ݿ@,&O::ZEmKqBɫK\"CtNj>fwb6*"-)tW3c[Egm7G?F,˯G]dz6"O&L6􅷾B>%$%x/{xY iLK!lQ$0;&íθ=&xesu /MJ.%@M)z'H b&s#w;+U^2KIs6]#gIKѰÑk8_n)xxj-Ikpi5tΪBtCu솮Rx糺ORmM䝤McST.~AB.ZWԺi]5s4͆CjRˊZVݺ}w-o|nߢ畖wx|Gwٺ6{~<ۥ26_qtI+#rk.hm겖(Y<~ќ珠OjnnD9Iۭn uq;UUȂ4E QNyt^wۗnBwNRJjE]TVQ$B xxgM*5;/ 102,9\&j:sap h8FC2>.T˯A'J|24'Vl;4{ +w1;iڬD'$8t6 &iK*\Y N.U#0/FnԬS쑎,Z]DZ*Og sr!TS\99VqQs[*g_xxNѨzEY>f)C꣸j*)L.xRы>$*@JX RV!;jF.4@&܆l (UΣL]HHL9Ѥ,93!lvFN" E )k^HET5gm`T*vFf:?r].˹2k Vn߭8rҼ_i{90/=ΥD+]0W '« >>ΏU`C%4 8Ī{,5w7'|pM{ uvT ZTVE! e=z'(YVkR+,'دjèa.W9hb=͑x+wz'XX$TcK#RJ:cfi>(SYd@}j> ' j)љqT93U/gFt̸pf] \-BWm:JWRq5ҕ4L>L \*hꪠ]#])Vѕ"5ȱ7tUbo/hUUAi@WgHWhd#*v \*h:] tuteE'*MB}+B+t J]!]Y˔`-"/tU2u*({v,).Ⱦh$uS꽫ϯGuƺ&1WG 1 ~|h4-׷Fj uY1Vq]?R 6Tj9G 7~jk|q4^X9?Fizm b6IWߩTFzPy]=8b*tXjW)^mdyV$S/)O^/*9Yhy82Oޫ kD o;6K2Փ:d1y4Z>6n aZtROOBF:V%*s}@u2Gl3ݩuklvqRasLmԒ.` w2YNJɟ8v=+Bm3TB_.抃.Q= z݇_wݩՊ8 e{s`! "O9 pc4ZvAh(-0cЂB zf5]?tU" l骠z3+n#* ]hBWNA? 
J ]!]!m+{DWXK ]\/tUBuB@WHWZHوc0{r6-F!#XoBJa/=0c 6z:(ދbO?VdGף ÖWw vmǂ?|>_Q k:l5Y{z4r^E..>iQUw2MT^$vT )_GT[şk;%S)Zϣ5wlq9|}Ic5ȺHHs1z_%S4蛸HTTo/GO w{q|mjm/iﰟe,H`+7bUVК·$h63k =+,D&l}V|XhbH5K-k&f9`hֲ74]Z&u J!>C\XWUAx骠v3+허طGWX["ЕȺNW  d5GtU++NovZ.NW00DtuJ+y]ɁUc̉O@71kUrEs.XTi4rA6VpCFuŬÁF>;kqϪ(76yS:zX~qW@y$KҠ.t9nbIZxDՉNZ[kY+ͻ2H"~_~ʿ+q)?FF*F01@e%ZoղQ.uo}ֳ̧bVj2ymr-?~Obm:?~(]:O=)&>" #X3a6C l:G_.(5]!vń]_]L J#KLZ0aAqMFA(a2uD!tSF%Qɮ뒉yGs"kHULWpS%T¿-^8.5h2Z#Q5"4"YsZ҇^G`H{L'S˺}\5lpܮdK5pV[)#.~t5ڸyߗd7k듣åf\xF]9o43!_V qhLxOx1K?X|EsʫT]An )U2snMHV;ܐ?zS G!J`i:YB-b}Y~ɔ$dlWf׌o>̦t|;0¹geA"z*9fƼ E殺$#i90!EZnzCLlbZ]F(gMMϗ7ɣ}9P/?ptI)!N2Jq2VU1!lyn+EW5ya·=iԚj5ٶ q42dڒZ\>dg7*T6JR|ЀE["yIJ7u{áDI@gw%9 ؃Ant7Pֶj_Wz(Ƞ))Je$hE+:E2 ?p!H '̆^ MJ4kczzZtL+cڦ0)Xnfrk>+{NUhj'FS v :vN:tAl ɳDDv a }hxIjZ%Y8Iu\JMǴߡZw=ܟx̰č(͟n^{a-@d2u%2 *P3jt*}q{ ̶m['çiM+'uA HсD-MB|'dY`q.KYfZT,Bg`700M#n!9%W|X=RS*Kxԣw;5zH(M|L>BS:^˕(zZ)쫹JhOI-K 0<3Dg2sNx. *j۽A_z_tUtK.:P!w&8kwץ׏fdLb.Ve n￴nVwwO9Y;GZ6ԲfG=5yWP7ZtG뫻9OvXJU:CzG@\顁sۻ66hi<vPB>^s6 ̓; h+x h1&fΉdb*)^-< gSmF،^Ep[9+B G8j*23,r@ ZﭪKFɍTY,L>4NH3p ,2>3WٗVM}ic3O?9h{V4ofM'nofm#tGlbf?'I̵L1'4I$/.q&hJ*}ڲH,p*,f*FS7߁<s|$Zr*sHU“Rjע&~ ʳ=.{K亽jkwYm)oW YF+a_AE`zR,:ZR$i~AEJ>4=ۺSa-s$v~Xο0op|q"1EkL2J 2!2 $uR<^)~M \&p6J@Rk 8"`M̸U>eUdt*d E^*$|g ȕj췗;\># Yqjc7 wMW}}n`uwD}%sI`^ym7WBINfQʆBy&& ;ݦu3y k+\r;^'Q1$ *&R8ޡsTL$jΒEJ$MF<1a[tNq9=a]rn7q}mfo}kD7fl|܀sX$U7J3&+[m8Z\ Ksf JĄ.I9<酞y9ʓu2D#!Y-D˲r2%lDA[PZ2J& eT~`_w&o^9^+A'yHIz# 9S6 ϥQ5T֟9.gާ=:{(A 'Wږ8p]p!+ ^k  ٞqJdzB1yixR.GSI%]H00~L)jRbw{nJ͠p?JZq7".* / 1¸lTc'(QJdM+ӿlmTTr%y1Dzbx~oFJPU +뫉s-)Wv&çRHd6hM/8: \pވRdC HY|^l82mtM]L*[AWd1@JbPZ :( 0++j5q+jz?o i^ҙXHFjGz\^|q$n,X ݀'OdMA^^f\ƖӞ]|YW y9t9~S!H|R0d`ȔQ> d1+f8t*b +a(ƞGD\,_MJ4ܡd12.IsLEc+#v5q#vNp<Ԯ6;EmP9`lFgi*qHzhȞgڥxUQ`4$ !HH'n&HEY!*k2Vg?F:س bq,"ʈDqKVJ8ab B6=J[Q^d]Y`LIqRXGs!ZYN0#%SSz.d!ڧ}g?"~Hb]|fɱ(+pqŝUe19,Rue Ӵvc` EFd"Lٔ]\/x \<<6;CUqxxSpUg]#z_691=7D?P.GG?UQ!*=zFF &53VB3[Oy%iCBRH`gƏS=G?q09wmJc1$E6ŶEѤ[r%9sqlؖ#|^rf~30!/r:8 9$P"ct Fǒy{F!YiBPЋz)U1-mȹc\nPDTij!UUjR)M aH+mgoG{Xވ MSɒsŰ,=ѝҙ?Nw3uStR$#D+SNJtуh=x 
-Ғ*:gg+^:2cȔ1J`A2ZPG$GT>=Fw_~l&M_("Y}Z4(G42GJEH6i5K~? )j2:qU jנ"UM>g8~H+:c3*>Qy"r$C8db~0C\]7|;B{ӽC5GP"͒o7*jr+]ZX0uFzD /`TZjryqѿə7͉g遃E`vcsbNܗ^?>Mޭv9R_͞#ןh $l -jFnF,N)4f,Z|?>EΙ՝Vg\ꪾfQM Cyt8UW{|WcPdjLONWwzDo~~{o~(߾>O!xxo߼&7zcY)"p~GVެipo~v.h vc:a )W}yݏFDr5"Uu?ޞ2Xe#L\vJe^p}Yj,d NF-Bye4q/?H2!9# MrP+ e[d 3.]\X#8vΆȫ О%-2=Z14E=b BR"À(2);tk:lTĚLB$2]!n߹s v'lhkeڣ׿| a<)+ l⿬ApL]kl!rduzPσ{~BkD1t dEI_!Ȳf!$9B0 8lVVs.fEbq $(74_LvK^JG޸۳LG~I'V u@o%0=\-ǥvVaTZva1ECf(+T> 5oꟍ=?1vSt#B}HOC.TO^|7ssU&hdT]kurU㚬&ߟ5k`6Pd{3>GO TTMT$$*2|nE: CWD8?vے1jYfV8Ʃkkj^T1?y1 q ) 7eLwExD8骒-͸ʱ3ǘCfP࣡#Rqi>rhbF P?-$K# *A`:7 ;hB׭yubS=Bv .]`f/SthLDzsYd)t\2<qhdUq7K!1\b .뻣T]ߤc#}j|՘+:3鄘xݪZl)j'cGc"Hꌻ_DWC:iA/߻ir^:Ny|J/3M-o4| \?N:f Vh)P&"Με)܁Km=t I.7$кB+%C6N`ڔj0D9F=l  e9WQ.Bsieo؜I|ӶM0Bȳ$9)\+<|6Rf:ĝIͼ<"8 eps4hȘ3Y*1bD *#gl>hkܗAT',KV̜3p&b)/7ȥht!<a=lDD#XVZ i/νTapJ-EV_qd7$6A6mC7Mf(n3cs5ںP5gD^ v癠2<=OFI/ҸhcYRz fVK)=! B\ 0BΖ';k_|h{v@Z)u|<ܸHEUvH뙐r& g[k+Զ.Ԃ]ۅJ]/p׶E,n"W\5P PL'q8zEH̒JMT;4 yɀ ƒSJ3Y;Oǻ`<4$Z} CCJ"s? \b~T CbZwbzݡH*n"ZW\)E\jtqUЉ(eV-W`F\ S" wP)y'^4+"05⪐B4. ;qrX-6R~B.B֛. /Q\)j)1,iXuꆣ2-dCN e=zuQ-T];9;nm?dedP׀(Pзģ'U&]zC Mȫ2ߞσioÏߟ2ݪ)U}1?wHBܥ4Mc1ȭ!ލI$ YQi2=c/HBN+ C&1¨.n⦻.n⦻77}Vsm V9ۜm6w/h,[@RVlkHr,j*@$޺vi Hx?aVQ^ׄD 6L t:vVTrٕlT%u#LX ZT696J3.*+qTR,4 G=ΚTaj ^FAba@eHYr1LX Bg.-t`\fe5rz]I|9${jMiJn7LU_˖X䚩{jbkEnOUoI29plLҔTʹ5@\G`^D7gYI-c,W#,Z]DZ*Og sr!TS\9s8`0Ϋm-=~V N=Eh+_+lv]9žV*؈bz|xREmmQ5 x, Ł΄|R6!4r7H&ZoB26dAѬ,&me 2#a0Ḡv+섍GE(\8A>2ּHH5gm`)j[s; 9'3`П.p1M ,vuЧ_l`l 0v%Ve3 `[1^}&*;WcR-gC-48j@%fj9_K޸A+".\`HlI Nb- \0%f֑2nMf !4Ҡ/se>Y!HPjz 'leT_SR_K/+| 1PVs*wi{Q{?({͕JJ`bH8f?zS d8ϢB*)H`Fvքje7%&#H$;zǙ b13'F3B%!arTp)iXv)e&r m d%E aƔ jI;+[ȹ#5|uڻ&*ޔ,GёaKέr YԂ[dagLBA6g0Ih ?w=i85 d)L+2;- ƙ ԀQ472(: N9>7yA[t9(`A 2^gj_Qͧݻ0Tr˥n%72e'k-{DI!$6A􃧛n¥KnF:6dGW'Fo](eNKsAv%؍Y,&EO}=fGyDpBDn$yp2Ƥ-WT.lU5<8QyDPU+|P9UDa@>o.Q,NE4YhʢgvKfB;'b c p$գzܒ*,7Cюb3T퐅C~NЫ:/.t#lR5MM FÍ=ټ.=iD8` 6nm/lDueZ-f6^/KCEJ)*GV1$\xyCRQ,$$xzyO2* 2<Zb4c1(sIUv )|uΎfCHP Fc%#]FK+%afWuN\̦f `1;aöɉD<rIL\IHBdD9#*L(4LjM]1\h|jmwơeTimQo-Qڦ+%Α{? 
GSL&HKW@% nH ӳd5B>9̤Mt"ċoP9,oA'U31fFy!֣䈷<0I[6I~d/n6Β8 8dw>O {D:Bam>u[9 br(m1$D1'o)]'n(ɪcIoÆ)Z\ϫ-{"' {62OC*wwuofl=o?r#9|\`_=]u3n~p{nfw]cvwmmQ_>ȁ:zTkEЖfX"=mh5͚f1^g3bS/BrJ&Z%$ʨGJEWfz'Ð{8˰U(,脶161a*HЦtPq)q#_a j6-ڧQKadQZ%aQ2A$!7n%p&y-h) 8*C•-+ɬ|ޠi22CK5b D^ #bp$U&o)81s4.Oǩ #"qDĽ^,Jz-=rI""9 LvC2'sq(k[+hqa6)My.2x$"j 1TC9~=bE. Yj:H>∋f bNjmuP.5M8q VLja)(#.fǩxX<1d V0Uя/{~Ę6*@j&Mo '`_x͛Gn͛wU¡@}C2Zŧk3Ik<vO{רdp,_ec@nĹMdߛm|ᛯ6Axɮ2 ."Hs4t~avamvXvV⹻$rR5CͷiH̤ vTwv0~g}Nb2i^\\v` ݶ'8ۍqyWӛ#T־lfI H($!VV%[R#reh$߳ztNA73I>[΅,X@E8HI dtO]قL0A)ML x.qL%ʬbNА 8-(&Α1+B o7͢4(trsӏx#7 junhݙ{lZܞˏrY? rHS}$ (KA* ^k{z/trQ΂y.o #00ID$6H-@h%b#)DBB 3<Ej WE5{.9SH_P<>outzb?"n\]ȏF&>c>w;' 칩u̯.|́Qtsflb=31gWM"ŌhM,R hĄxb#Bψ<I*j4 $iE bPِSeJ(Q^y O.Kc-xaxZ/Y`>G֧I)F;&>wFz AE7zZqdP"E-?,`x!  x.&Igv&aPD!E@D' h'lLԳPL i*Q; #?OqU97Ke)E`A+~M$TBx7v^_r y[ZrV( w몕_<8-GɈ#ksLd[\V#yw䧻bxsoGm&n{5ϊۄe*q2Kޅ-fprۭ`mh͗K>dES4N_M>lTk>pI6?7lna9 ׋ϻYVdq= q};ppP.bn_jsLM08^_"Aa4>N/?ڹ*Ǎ)I}[urM~:nz/?E+#hl1̺Z!gO1˾+~b4q5uj>\t EJ8rӦF[Onb?l/ ,7iX,% 9*9Y@uT}fi݇315Crdbߺp3i7 vכgiK@3@GP8bmgSMy4ٔȦYs[e˰BdS=5VN$RA!nU<02^_a\N3m]sDh'Dh=?xM˓l܇}[:+vVly xυrT"a,SC̳Xnk"K><(68%n$yy3!n9&>B;0/d(Iyo U.NTm2 JJ*Gcϕߔ9g_&J$)+d:K=3vڜ1rOzKƢs@eT0IPr1"L?{WGJ/vI 2x 0w1ӍFk<傥*Զ{1}uIVZuJ+ KYɏd-7@F[ 3僐ZRYiPxczgBȤb*%Hy!GbۅZd: ;畝cɈ\I^ǻ X\~Ns7~R7zS3(J/\s#deR^:o & ;%Eo Ei?er|W3}`sY&0.G%xoR8ιȣ8NCi#L=ϫч;W,vCd9JKJҩ*JZ{;賤,7P4 cl;K,:zôx&UD)U.cUH6Fc2E*zA 5G_u`y^$S[BJ\k)r@mƠ6(5dkZBHJȲ4OmcJgi꺰}dźab^Ӥ(uoȻh]LHVZDDX26M+'$-BQluRp6h. 
[binary data: gzip-compressed `kubelet.log.gz` from the `zuul-output/logs/` tar archive — contents are not renderable as text]

    H{C(fɘⱃ=B Y'LNpa.:4|-"ݍvp<(6Wnq02qx}܇*]0P~ڡvn2Q1X|/)5Jmݢ n}hOQ:=+Չup1 n}hOuy ҩ'iwǔ Hח"Ib (we B іI'aU=Qu g ;dXɬ>6Mh4%%J;.fT[PrV)ZЙ%D@"Y0RX̿X/ZQn}V߻SB82SJȻjA+a@&,(ܿ(BQF-yIJHѫbP5,wLCF``=Ԩ1"7#V-TRv1 tz7*☸?M狢 xc-b+vވ'ֱhaN:-}h/CB{8 Z!80~/s7.qG<JHΦ(P>Pc#/9>/ᡋA3ƨEPNBԽl+1G9Q17?? 5fxXO[U{{0!,7բH>Ը1}݂VmarmUVqx0 A@6MK~]Fu2`|RӢBёvN~¦JWg1/ 3dts4o.UCXO"{C {"D!W* ٖAF"C{3d[6m6=x{SXG SS$R2[ZZɵ$aIL혅%H%rY.ޮ?FPM=@V8t_An~fr++:г+8@^ђZLai(m{#2+# q>X'Z۠>艠H#6Ho|o HoiR+veڕ˧LF_+ J^b/"oSyb ?ڡO Bpch'>QV"&60O UN1@H03d#@5&L2)([[RqDʓ9c;UFV' 5DtJi1@Z+' k\ Ix,010B$Ť¡ **0X8T_2Qyq_2"G. @ Aq új} 0OC=3v}7a0!Dު=*6Tw",iYٯt[_Q}CD/Nݫ* u[oF޻b̢H/I^8N+[Gv߻De߰h>a {%r(cndfZe ӻGɞ:EA"E$$Y؁pPdv$3 BކZOz4\PRa`6"Sp>G\=a'e-971La>! 9g @7e!ZVQg?-`J3`g1zaϢo!az~GBCiCx>Kl a&Kg/jBF|3flCx(sG_q c,R(s`\b%bVmL/ Iw3LEE'DbRicW @29B`rbj%!d/-"W9ZOœ7UjiLUs91B?iuו^WZ{] fZ ͜0+Wb$[k$H͵bS vRIdpLfO^q~kl/oT=zRUT~O ogU ־RX,ǜxԷ;5=)׫^'mt\Ϳ/ v_6>I&r"V߷.qU LS0dA3Tߑ./DTׄY ΕJH'OQsmӛ/t1Y+?ý=X6yh^v=qQp`)zD?>\W{jyĀ }\9l*￾;`X '+t+zV͝$=@ٻ6n%WXz:#R_XMɉ+De9-P$II\ۘM\8r95k4n"w>Ý# P簭-pɋ -&:? 
l`4}{oF#x>Lpkn h0@X}s#@c5+_ K7eI%X !kkx 3j.:k-Hϙ-۪6ZNn.@BBa,߁a;v#ZvrQD l]_mDv#.Tc[G< Ywܿj=Fa&`xae"ὄ}6aDLED#BIp1HIhAP,r@b_tWV3(rE$0heוJ.vQ7#*8>g=&lEqLUЕnOks@юGa^l~cz.k͐k0nZH=nݪPhf)b Gq4sɕJՄgsO[`7|/{> :+`-.*[C%+4Gd8-ឋ{h3%nOgld*mrsu{B ta ImQfppF5@39Q퀰P9NDR޶uvĉVPDKMh(LU*jS0Dsa]aQ=T2'2$Z.w.;A̤V%`MՂv+ThpVGE Gţص1 sg }\pd#Oھfۜrm?05\ᓵK!qco}z#T ^߼{3 F *bWLe!=PJi*?8L Z;jNRt#ݵ<3`}aAwªjiWP' 8y<034yYax͚0spBO]+;~3s1kDZh#p)h~t>'^@\bRJIN;l)w^[EiGJx~`!JJBYOZG.=땷ZRyku"+\н _)zEB,Fz}.ZY]f q) E1 !AQ^!G"PKI(XE)ν(`[!qQZœ)B,_#7-$@lw栗]9TV9 (YIMċPtb2~awx}Xg~㇡Ƌ}}iouy24!J|Pvt恠~xYy{tD -gv²Je]1=XbQqϏSRi6d_+/kyʴ[Eh RVO Cf:dkNK њ4O0 T" |x] gv8Ѡ"JB CIpX8 UCs^7eAS((=X8X(]1g;<.~,T畨*T|Aܘ?,SԝE6m ʷYQv"$!4e(US=F!(&3x4JDTvR!*tR TS)!=&0錧ɿ^)VP&'2Neav: 9xo$o;|ow`}Għ\p•3Tӗ6\Yyžy:j)NYK)^!b.{͝X,Dı3+xDBlNB #322"e>M "sPC#EgWbLW og,Ngq8>7 !%8-ֳX!H8Z5xŤh(l"%ِS0 4y< > L!Ӥ~xt6ꝍ8 9ul'jl*/8H2mRO(AoYz6ylE^J-cE,"&,wRqAN0K4 ~e*Wv 9]ʢ`}=|G3 " !\s+ő4x["Vd_spI 0A8LP&9uT K j 8В$Z gʮ+el`s.aύPVq%3/4Q-,IBEJMB poԗf>L}7c1G9-F42c I~bZ;G1Ŝ Tχ!nӝ.+*$H$y,<3$u11j3ZVWyol k'8Vuƒ j5,L J)`z&R%K~&cO[ T[>X̼ss0"`mpG2#& ԄaL4ߪU@AVU;H!oj2Ahjv96zn@)iP&x0:5PS Ƨ]Q3?Ã4GI`;3NR),2X' S ZP9PNm=-ITԄ#$sSUɈ-#"*4Ds\kL\dqz׽qi݅Ŋt<=g0ˊ哙ub5i~($yCG.;ɣ=:(}C㡻--q1CEGD%JJd=%(tqh")D`l qh8ɬu``oc4#M k!4`p3CUGL^tgy;_n0DZ v_5|?NP*VsaC$B6BEct3|L<̲L>vb6:@pi~-{ZG@4ާSP8P_\ c J5gdm_[5\.ZX>iw8UfV|k4Bwt9|̃TmeEj1u9/GBYm7ϾvVl옔ښ) ࡸ&xΆ1p0*%5xպ D0BՂqv2\ Zq3w3:sӵF]【UK%f]uͳ+VX+Q{v*VJή0hjl!spk{B:skK(HSDLqRhS(T S4􏎉~ڙ ԗ9\2w.-otoI?'~s|=T,f4t508T\}o&󿋿uzw4O? 
F3v?=2o>#$^%wH]~{W?cŊ~2AyycĿ.o|)TG?;_ g=?"𾌐];|nMx_p1)<2 Ϭ̰Xiƭ{KrLIfdw&:ٺ7LrQF.ߟXܣyzX迀M|Y]'L.x> a +AJ^s6>L4ޛeZR/ ʗ& &-xzɐ^;,gI5Riu%Պ+jVSiYI#ǘ"X(#=bkcPL?1C"1L{VBdTNQ,u\*㙏`c&bEXS&\ۈ0F#FQ$Uv~N.<'2aHP5`꧕Cu:p~Y\K4Rߥ Mvۂ=t8WÖ!ZW` }Ip lbpURe/h{UIҕ$|/NT EʖRg}3 :+S-)W[)h N o"J92 sc*FZ`q[ck܊0ZE \*RjkY9$L ~yj)M$2pG#4L!0YXL@҄(p #3mH b.m[/Ys4KXߌbĞhA揗9 |_;X|`uhZ'$S m"i}A4|n@gUqZUdK5 5D`)@L`L{7!2}'{:yo:eɐ"$C}&bkcj񜪃 Ωj3LX]o/ǟFO ONMaXeUr:An ׹> F{H/x5px S:ĬL^Cz|.̀ZWoȁ XHTmF;~(<8Hv #Y'v|+Kd;\n]E՜oRH,&~I> izxy:*#{>8`O:X,ҀQZfAc>FH*%{?{㶑 9aKFn E2iI~դkDFbĖHU_WWWUWU׳e xj./0ht.1FQ- zəGҦbCRSM 85i*y+bS6֧nW%FWO_dm* xZV No2n$T9OİnG84ʊk~IJPhO,9S&e[03nY0My@St|y!.:jH4+:,nTKċUk婾^Z!T R#*:Xc)6>LGJac)#8 8:E%L̑é#!NCޠZ1J3t4@zqgυNYc?}\uw8,"g7}W|A\Uܛv~xLb(-;!~WKo<9dt4 'y/5Xl Wysv^2KO,r5#!߸֒C}#zN Gu ڭ.eD;h$mVѲڭ ELO>²[] ʈNwTn51BMfvkBBq#Snh7 9 VRS۳Ervx3 iӲ1tYYŝ V$XΖd?JMtrѸI@HAJ3R-3RO!XDJ1/kHT4/w4Wָx3z:Zu$1I71ҩ`#^$-ǯˉA3<ኩ3s4Y=4@W[GSmCDJs^<bLQLq = L|`xg s^%ʐXRQ`|Ӵ;,wyߋɩ7ߛ9/8[,ȕqbL]o0U}s9|4Q'3^=;$:;/$Qbk&i Dߒb%%rُݨ7LbEpLJ=&>ڐ2f IUjcb#r2sW7D8/zݙm0R@e> =_K60ٚ?oT m)Hn,ũ|hAgP0LT2-(i:x55kߋپ~mQ{Fd_ƨҍ hxzc<۵ص.쁟1 -1f7xdե*nBv U~ϖeyÇSay@ xb~?&f~ 8QWw7n9nd}J^ߌ' ``ϗ ̑\5`46ugUfP^/(8p{B|A񧸡0MC@@ne~-5q]s|A kږ9x1ߗn̽q2S/RL5eWlqݜ˝qXÖ.`A;`=U-uebx_$6YS:+v!FO^)с:LV)y8sRJR D$u¤PzwDqFV˪1Ue^ucy/v6T"ة4w<+ 'I,ɔ0?B8՘BdIt!?~ ?)ԫԫ D$UW Gc9҅k>GEw9)~.Cn]rXTHlћ [;I7(Zڵ qzS5v ͲOp6f%rsKiKX5v$/Ax.U8>يZ;PYx6Nc<|3=Ċy78a߿q?{QmO* [5J.^e;;;U0l%%aYHчQY c@ag1/ǟC~0FhihZfܽsᗇ4 hTg$Ch.hr8݂K]XsL'K(QRbK0m  39q9=BeMO,A3NƏIO!#%r|p*21sjgW0i;~qMTb&&)#?ɿQύI\V-9fG?sKru":ft&@ NyqN%)1< S)} O (j]l{@Z7998E}H y\58aJpֲ`/9Xr8rMy8/ ~/ с3nab7,LPG^Xլ-mAߵ j.sC,A7OVdM.\1\ n5W[PINY2ޅ,(oF3bv@.(eowރ-x;t8,Ndz<_qhqhZz &2bd(W(y( ^lW/e 12Te/O7J&}x낢|E&4Lx}'VHN싛8\v'o7KAcqKvb8LL 筇W/,²'_qDjc-ZFߚ9.&jt{8a!.H8.Yld.Ϥk*zֺ\ڏY T \ Z۶򃠊tKDQV%Mb:4R%\Pډ|rM[xk~Xkʓ1׸m䲒 DTbKKnrc9mڽ"-_ dA;.cp\ x;a=3uʓ?p+̭^ V~$SN^?ٌLռY۴Eb؋b-ѦqBԁ0[N2\y(O8W(aD) (T!0ȟ\.fjg7%ח{ ) A1e DaM[$7p;GNJ8e z?p&VN*) j@)a5*eNJySkRjcF()|JbxJĜjqD=ӞXn K:a`5l:bq i̧L8|2iM}j(cZpa01ě9 =6<H*ɪF=U ^|3+0=LyE^|k7nqvwOGzh1WgaA\-ޑT}8G 0FYfl{ 
[bRHX_c}_CNfDsб:v\әkdh>gJup6c9p {Gd粻 / @8V&?LHM|pm򑿲7\ n1>tfc3v?_}Fdz^voӐj9`Fe*Ie~(ɨҷשF5h;. 8ckt0' A ?f[`6,O?aKmMg'Y{Ƒ+Fn)w )@Dm;;-w[%Eړ=,piZ#UzFwP^aśt]$]o>7Y=[a/Goo?}=zWNz W?72J{xcF5/gB\wc;LawJT+,'Zr軸N[{$Szv,|z(>z;!-!sl'5qc< šݮ52=If%mNn0.lcXau.4\{o<5=IrN/Ph8IVqKIO|V ܸw.EDedqKX.Ww7.YXX狯;]-x";j$w{ܗ:v9EKL#,Zd@OsmЉKgI%I)i #\ג+{-urz ԭ[f2P7p>qlS 32ёI~R=7[A0.Lvf^BnpK\ /PZ^{ft]l#~tc)N9mFfeCNe*Φ#ۖE_}cld{GioD1BAbz"t7s{^P"0MrspHp`P"68Dͷ J)=32A$>9Tqq1 Ӱ%G4y f0`0k-^h ~N8ǎV|GGwz:3@=f7>H-<TD .s虗tl$AuR`-{2_%iiĠbBjq^ݡl֋+Zgɸ&x_r>Ě8d8EJR*( 8eK6Z߷jrj&1Ja:T&y QΤNRRiiN+`P}( n0 ޼R؎;g>jưDqQH4K0c4) lAH.0Ì3#4Bqkf N JJ@y&$+.h&VP$Dn3pi]; " ޡ>gI*XkhJ5gi(+Xb(_B@4A*gHaZ2R39:S$ n%Y R%@Ȑ$6iR`%6)'DjSS3)!b ElTNa60&KR_ sL "P87\pz kbP"afеdѬ݋L 5"”`$igIadA`eK1+y\R*SYoX==8AV +ZU pAqfa\k{7Wtp%"k<JHfaʌ} X*1VQ=WgY|ǵ'~.k;PSC"G:TT2SQeG"PL8 NlLB5?٦C]G.f9+t-|ڧNI2P jƊHs+PS; 0bSTL<2l;p?Զ1ZOYHʒ*$4MB4beHu ՅH$GpX=TJu AM_<*MR3*A2\IPQ2g>"2Lp{p( `MCv}ZKЀnºz #!eoT*^&l1oPq{5,nIz5 3o6A^Z0ȻC C /aly/ǐwykTtAޭ.oy/Fޭ.oGjBCL9Pk6x^*q/f@ zv3%^{E4BAbWu_osXfҼ=>zw0K1Z0s:uz^hpwW|~텄ThgѰqGBb$yDI`$TgmuLa~hUq9I=HP\ !L;E}ԠkxkA^#@N~[dEHC/ֶqPcx[\umqiltv3QT,l&EN5JXQ b|y2bϨ^ >㈈,w̯Cxl.M}1E Rǚ c'F<IyR3^D0̱e 9N[9 YU! zt49EPi9:9",^_t<4ndy@9쇖ֺrlC+mM Zafɹ*gyZQDmee<I FDؚ:)? N8o-giRzyw8y(vNVurhnMCԼ=)Č )%%m} ~ʘ\HS%pf|Bg9B> NGNP#71H &5<_x*< VaG*'eɞ8 i GH~xc#hx'gnl'M˷FD2t~>p2`?jD Zm(6+k2oyy;hx'̰Bwt%*W]KV s j$>@B(qHY:w@@s)>:/qDq <1h*u"dkŤ4 3v;5T8fձGujrg><u(dϤ,j{=LVGvdfZJ-;żPpDe))6 &g<5>mo4Eg/on2IN0B Nbi$嘦2%4"+a*W⽩OWTGVA$-- .5n~dCWfWSdRD$>5W#_ZhDѴna*r$z{h$ 5E`uVvn3ҏTIu!F6 XEl9w5&\}fkI{N86ii#|_ϱ/zԁ>KZp%Q5|L?wjsEjd j|8t TCg)? 
*t}9)$8k|@XS;Pj2c"0 Ҭ\Yp7ᎍmky #1C &S+[?O)FTr-4_0._>{ 9G|L$xv 89]0.QۃkraqJl%+lU=`y@ Nݽ_V*"8U1Xd4!ːN';Q!11]g=g=EXZ BىMŷjܬ01`N!vzԯO_Qr5Tcq^2~^rհ'9~OsII(?`老,Ќg3qP ׂ"U3jmsߣNG|,6~mrWW7_ ,6lXnt Gܯ&E _gu2qq(s=eVQ2ljH鍕d&`|a-{+V+G[szxwfi Fk v{\x5ǟɗ{c^/[9ar5rsj[WXt߾y苑15 4j9 5kLrn3>ܚi%֪:0a:Zg]B_;X}ɪ}xf -*J~tGr W?2J{xo{D8n>ŀL ogA'q` XٞX\LRZ%HZCW"JVܞsnc9ٛ"D4Ffeغl:x`yc@bMHYP2PM7PPwj| >#y߀iF_/e%#6!Д6-`j;.w_?ϿxY+,6DYg;M"!dxҋ$e63wCG+yj؋Ag`_QSRi.SY,^]-laW_glv_ Wޛ^WY}S1zoT[^ڗ3\/a^.|#uHl+?ԾRjؿ ou]^m.N\¯~zUS=շfUj@_΢{<8Zk6->TԲb4wmm|Y, I| 6IVH*wCR II4 ɞՃϓ|/>8^}x)[Z_w?^G:0 Z+.WE']/֭?yu Ikr`@ҭ?yR@߭?aa\tOGw]V;~@'ߥ?ᵾ[֔.'z҅T?xqƏQ!~`y b 2TdP9z`/x=݅tGqP;jNc B(IGMIU뫺jtraf]1BV~ԆjH_ld~kb`Rq"1G- eO}JdWW|w??fԩ&+&rۇu_?[rUkx*(uC : +N6<:x-n]s?4 HVYAHbg- 3Õ>rSn.Vf?=+ZoXi}buVAD֯8sR /&ui{9קy)U?NL'ngrߧ2T:Mp˗ `:|#_888.sQ:nRグ(˨=XtY-wS ܗ$"u1j39Qcth:zo, "_&2 t6BPvuAgrP*9#9m;om9sq$yFZJR(c^ -0Wx %a!;uj|P;J=!-[(0:T?6E-G6\>vDѡQ5W8"1@S}q͸_ƩH嗅!iy ׁQ/h5S!g->0xMotuq7wCF*-z;DÎn1W_ ]ZTֿsmeG|^tHU0Iz0RTù?撍`uqDGF*UCHw^'gIU.{Pm0UCj4o՟\cDy_IQ%.}>u1Ngi1]L A*Kį )` L p-]5$DK|~K1{:99˩h 6!x{: K9xGR*V<%s\#'$u]ad(9~(v `;[`b㝬3<\v 1uzә~6+&D3 #|%jN)O5~ĨnU "2dor"{ .6yy%@Yuv nj:Z,rO6+;a[a[}`sY^w'|?}pOkp>pSDąn}A]Q2Jţ2Gg†#\ց/uy cy!+o\ݦ/eHZS׆8RT=<PÅ5.ʯOL&PӘp1WMu=&Mֱh|MGA[oKvlr$4N=y2 ?g}b2Ǣ?(- ;bPG~0^ %>o͟ZV&1{ϒ nJѽ>w?ZS}I1KȐZƦwzv Mm&ZtLkmhj U7JnH#d@k:y]6P`RyK ejZpmbρ~NA $Py~f&-B{4WD4@WxY1|iͨ*VAD(X5*!rՙJ@i 24YIgg`>3@ Lg`4BN~ Pڥin<.[u:vseh@и77t!0#K'-uL)aV$eIJR ]{SA(Ox ͍>Us03~Z*ܜ`1<(FhraAN ,89qrRu'U7Z(\+JKZ_hhqjV22p#0ik[ɩ(ah<,/LLV 8d]n"NoSi¶?V'T~z: @`6_lt| <ōVPH*aNqӠ _NU%F,D8sRGlC`dF{TR3^ ;]<䌐ފ-0Jޜ]YSl(;co!SIj'^\9qp0Dr{NqT +?MWk&\Ǯ _888ҸJ rpW;Rk),b%z\p+C{XO@)Vz%򏔴}P _)fK@(A͌:ǒ Dǁ-u⻔.F6 (~#HqP%)9Bkp!:\TJ/4Hq^>:z2\bVSa/p NpKkԡiC41+wUq:p뿌gN'/J⃛o1Tŧ~W+S~8l" WZ?|(Io'z맫Ӻ2TAhQ#M>Oo&ﯯn?mN: = 1] v:]~e83RF3m G<F>/-6(ApfD$M16ajtY0b$n<\w#GIB#&x@!J'<Ũ6<`(̘-BEs-yeL(] JHk!!݆`:}$s7nؿ(1T Od~;z_ff.cZv4n7eU#\f5eߧ׃[L=KyZjO⧫lLd1zv1L3Dt3fg^fAu)F?G^8H:*֮nt1_lf%MTz}5=)l>h@W7ܨޏ/pz/7{zJxhil_覺D8Bj~YM֗&+B:WC|+yd9h&wz80BJ@›p;?WOz#{wi"<8w Bg#H 
*s>_fw./1}cZ]w=?0Vxh-IJ*Ʊcon>A"6BkLUS+  )X0APc៙߼`5`5϶ꃃ,j|\ь\Xt*JQ"@Oq~Zf凪Bxv2YŰzU S%j1g}ȮJxMNng{<9F:v$P%9"gH9B(e̢'34Ţmev6ۺ%+vvF3$u%m ?ϙLkOײX$85ҡ&R'cu2{ pbifב:F#OQY% 0R( ,!1"zn4xV3GÓ9ǔ7' [eAǨO 5F"I+0!VkTw̔BY):A݁rAk`hg+.z+~gcߩWH 1Fjńm5_>̔jjTHU@JEHI2y?]?Ki2ƕb 缔vR;T@I}<]Tx[~G,_>fZz-4N_5E<-NS dgәi J^.Q' 92DZa(¸vB<Jdq㺫=zn2(^NA$, ^@ J)j@ !T易@#O!glwJa?CjjQ;d} "iù3\q<[P:bAi72kp-Blw"ZbC1rB{"v*6D-}%͕XBy:H˚rv:ώprA2H$!ʝ֌H䙓J(NRybyǴSK֙fPPF2#&\[g3r wNzQjѧT5Qh]7tj! ?Rт Qd!c9'F䨠CD{Ƅpcăk,8lJezUS7 VxtsSҩccvYGyOFpߕ{fQ-wD" Odٝ>'dBN\xZ Oz`S?ݿ[[8i~Sll/\ ];AA;#7r߃|mwjK?d>Ow͙SQ lksd=Mh>ާ`4ÖI_|ۛO3'FVƑҫOqP&Nll.O|l&( ]]t:G[$NȯJc_Zj~'XϙH ΁`H~zgX{zxk<^Nե3Un`;b[+R[vR^bʪi,Ǎ.Vn~,L1V VV=vp )rlؽ_.ó^4vY%T=ه[ٺу3v /p FHTeJ}fh}g+@| 9`J"LzbQV1 px9יШ(CֲL0+KNe [}PBJSegT͏Tץ^O߭d|[-=ƅΥRtKg*{+g7h|LK7P0d\Ɠ.쥣<僙\r1ebAv{efދzlYZt~\J`-Ob*f`NoV@dOUVy}?Ętp={hgi-*(gĆr'0 qeG2'ǨUtNfM9N [6X9KIL Ș+m=<*M&/.W>~Jx]p'orHֈ}Hj^i^\}7eQLj2ڠizqdw_3ǺڠoքH:_Bȥ#o_ Șgu'+93RY0?Hg_lipEv XH:߬ѽ!pi<7(YY"paie,~VgJCb-TC4`BIN&s.2Fnmߒi5Fŷ.,Pࠦxˤ>u t¢ y[m*¿:c_(Zq}NgVeJ3~ Vww|޲r,79f,aHv͒hY,:+\FĈړ/8\Zr BILV0Je@F_C<9OM@SR>Ds?}(O۟LE S3Lv&ɺ1C($&kN]0[,CĚ).Xi$eQ;ǔy \:/耵o,!E%@V9!#" !*F"YXjM5`ӿ^`[`"îmRx$7F4"8fe10>5QF= 8F=-^S<#d Kzt\+{CBj, O4 SFbnׂZbPGBa̞hTQLm.9ٛkznƈ 8%#Kh 94OE FdBِm6*{kou{?'غnOlpuΞϿS & }f-lCԁ)3T!_ lI{)7tZ#m)z/BWIc0*b.cy 3vur7Kc D@ 'e- ̑1LۚJeDEilG!J(Z I _j qF|j݂B;D4 .aPmqȍW\ZqB)ĸP(8u8kq')82rln#m `ylGd$tD2 ,M)by25.r0d~#bα9rSD;P6fA!좒il-00|uN4}[oqB`bf5Qrns ^FσcCƾUd[qb8@`;pP;SjT iP[`?]Qf}_8r1H;N]!祲A1:(=*=T%%DŽz._2 Z D vN'iP`d=4A! 4lt5o?'y::I tzwf2MX0yUm(_=G}"cgSlU)^ܙr!NwfOTTIsteHC{ǂ  m\:0~aKAǂ Mj }4],!f@@?l Z=Bnf{~r,(~mG+/2Ïv6mk]~ҜQ v62Qͥ392,BQ6;-*ʕwT։ O!92׳-+wRpv?NU"u Vۂ) E"# .~/}IrOΞ$Wk\`N68=`o:J fPN+?}B$Y(b1U/:HZ,rMOg]Ϩ^ɗ0"(A`ENַ]yջ֒E)R&ߕW{M& Szq=iKWW ۊX#]|' :"@)Vj$P(y0^ ~2G T2v#N"CcB\{N1 <@VUiUiU벯nkYPѮw덥VZo. 
-6FSXfWu8HpYkyCMǿe GDhPP6W'k]/&?d>OwiX]uu֩LDixV-x?{Ƒ 9J}C u}8 < $eǻj7qș*c$}UU=Uo_Ԏyzˮ# ϤwY7gf l|N{yv`9`>FǢNy!^.2ۧ @1d2ߎHii>hjbÝW|;s߱oiux>wfIO-lӰNr0}5&u:чE \C#u=ѯjPPZe$#>½BKBܡpSN(]ܹ^.Y\?>cm) MHW_8ݢoOa};cu;[}qVo9YL 줭Hn\<Ҏy\ ae'NVjci53]|Y sl…om{=׼ dxnyqC* 3P0;{zqҞznEV]^q-ϷwqvtXk6t &t#ӪೂUmie ǺTKWD.Mg*^8nB]^Puw9!ߏap_Қc&^=lBʯ n l 7_ϻ[R΄r-npkV[ qn7/ hܶU"u_7'eLHЖ'T9Z]jr%f/-.Xӏ$VC9@@R^I`FC Y>zJ/{?a|k_d~Bk3kG:b 1Lp#@䙛PRivzY>HuqxRdl&;I&8mubc{dDfmQ).J۫N(obeg^E6h̷WNTs\_^C)nvq1xKfՓd_9901kyB"7V,w<tؖ(fF4|hyI+EϞ|'IV-8^"cjy; I" YX5O-[Дc12RП"O3TkĕL}L}4JdnxtՌjDPedZ~+h`0VKfFf:I_"hm4T3+z!qf@XzYд0WN FՍg5P5¬0+!M Ui_V@XEڪŗ y٠ Gںk[ise ѱwbMg 0^Ed.{'{K ~>{f?of/ HF&}ը IWAW^3*E3 ZdgNi)Q;QprFmѽjn9Vyufy(G[r<{4t>9ϲ/[&uޤ\QAWςY%Oq(\dGAGvf9%ʽ{֧x~U~)8LHOrn י5OdJT }֚=\)jQS?=u/̗^w%~sQ )SĂ@ YÉH>g\?UڴPjc̀;=$|63a>o?}`.Slz=vۑ}4u}*Q|(J_)^6n`8Jv a&s, J& j*?l=LLW7 PǷ~jGRt8e\AB_yهPxa0;_]R!-o,,ܺ7j)Kq_ rd)gQ{;.~7f$#MY"paa?x#n|rif>!*bMǠpNGTv [,R!e5&"peT@\߽Am|; f2M\'bxLIOοkӛA )F T*`P(d$s[99uR|飩DǑP2r%[>!S4BL4taE:ăå!M%*=1*B@a8N3eHB$8B6|3N_IۿNǰCʓdzkՒf|kZ]xY XRl]Jhx"{E,1JS UΧX6믥W +yuFafA F#-8w1h! 4l ªP2(@Xgۯ _Gpumf}M 5Qri|J 3Z>=#B \3 +$7CL؉d鼍yMQ_ "biX^@D\~|Kw FI.<\3}|86٣ePxg1JUl8[[h*Mk[sCC7'9Ap|Kl)PS5 um҇J;9->@qO6Qv1>D8 lzAlSR$,#JKu=bM V4ev"2N\Zb2T  `+.-8b%8qmAFG9;*!BKeyŀ#r`u3iT {܁WPaq)cKAޖ(VJ2B/'Q!t}A{5ao[O"$~<7Ӌ;hXA-v7KcU* i>b-t䂧2Z${e#O @!DKckH:/  9,f(LO(@eB`r`He5Z! Rl/NEt*)p.v,hA<<KdFM6x+BHC`Q 35*zit +r:z%iD*yo$Tbd!4"t ĐFќ ng !A3pF9QB^X$N[,S)( ` qT( "p0R ȟ ,+B XNzCR7nbQKF15L( :` ũ*X'HW Z1 )t>P{a]P0EQ)F2=$Xt@ϟ;- dewk¹V\Tp.FQmwg\j̕#Q?~7+ZfxST+TRVL[B6x*4!֋[] ߐ hC (oѧ][WDe/b'9z_F8Ʒf%L5KBzps?2ɹJ(mzXrd#΢cyX-|qW2o -b~}/ٲ;ߠͥS#֠PV?6ړJ՛"ͱȲUMqx$W6ʤDcmsZ7{5ɠb5~5"!yr_LAiyIg 92zRfRWA'7$ň(%\4P T{gM1G{0?ؒt"ݙED܊c"f`a% i"Qii6F3Ԓ +K0CP|mݚ ~K9D-%p`y~VḀ[HDkE<]2S&aђHxaa8eN(fDE. 6dz) :" >aZ۸"KQܭ^q4ݗ rɵ葤=~Õ-U}XN~r7p8Ղ@Zm=[iN9oMW\^| OYwˇ,r/јF̩|3Y_&GRFlby1wM4nzCh!Sgv5اLv}U(ٞ0zevm_ٹ,T\4e` &= W0ݜ[o_5'X -uU-)3])dgi PgcqK*ZLV͖&*ӛnt+|m2!v N>Z"B>4 84RYcݡ!D 'Jć?>7` HOb^_fYyEՆ9z];\Sᦱ(׸mBKBSKOM2" %$}ʭxR? 
~F3\@B])ʕ:@(KHEW.xoT+gLrgJAU@Mm)ཝ=ۄ7٦}賵}T٥[*o t}lk۲5lX44DyG\#9m>JNG! .T۝@ƴgJZy%< <+Q3FГ0H"iTEԈ#v5ǺdnaO3L>S9}n}Vxv;;k/ai {׳I2R6Ve|eSc#F.A}*uQ c,<90ƃ ]e' v׳c.*V\+ k|Ҕ\n%:^(zœzrxy%}58|rti9PtJȞ*kw;2øP{f,onW+bH~3 X xY5CIN058(ާpo\vw:LWÑGo3-9MyS{{a$K0*@a&Rh!SP0UF{v8υ>4W1dWJ;0Ʒ&:= g95Sg5$gCؚ)v(wN[Qof8=JmOgun% H[⫅GdTɅ(:BDkAgqxJ ʔ2PLw4 N ޅ5@VQT3‰*r0U{BScS#Qz437a!XZJn(tLn.U$( 8SJKe deXADrwŨ kb/,w)3](xnǨ2ʣkS/q3ɬC%M3IR "P!jB=XNݍV騄(ɮ(g Ψ( *K0$9gWᅠ4^{h|tG(h `rLd̡j88pzI-cY& :(4kR iy# K&) 6G%m< ќz)(N씱F bb sy-P"u Z9z^Z'9zc$ĵFZx6 ,_DQ]1/4r\.H->4 IlF4",Cgg&,pJh)j5}/߫IŘ:»4$),Zjjě/_洶8הsg.ͰTg538wA@9jM!aYU,H_Ŧ¿mS|{7n$|f@lĹ.CrD<{̟e< HrЮ$&#A!oA6^,Wwz2Ϯ>'Ը /R6Y >Lr+ ?BBwX)9BF8$z6GE!qK]7?U~=%Xx4$x^|!j0 ai5.f_ 5ĥT (.I =4`P+JtFR 1PWؖ=b żȻ瀁d!VL нOG $᭡u1{EkHbapQb ʄR[1-SL(&T[Zt$T }@hz?uŞ GX0 `KIu #⩔)5MY0c[L8J@17ϱZR *3:Ӈ!6΍!^HF:(FKQFAeQSDqHWxm3%(T *nhsj=dBKT17i뺥Fz6e D,VVpb0|iHKz,u~ZMVp;^ }z*O?ߤ ݷxjztUd׻?4ĬYiqA,BQa ,ԌUf$fI{R{PzQqcOYԽ ? (\EcUGgI*I;l'Emw;~%({=cz}Iyv>|!R9U#߾Dȕha74![?n^V~mdwxwE.\?:xa[԰Lb+Uf&Id?#HCHzbT —܇KFZ,)lSq~\fS5AZoG^q$O̒X?;4eb@Y&3"Fe 6%dD癝)xT "1 Fz#ːf@q1rEE3;¿}e#K\@;h@o ^yJRJ,Hqj1XJ!Rɯ([-ekHp%7>g}g ~%Ŧ_*LilzjQNx3PO D=I s:'f&7%śYy0^fw!aZq Zߏގ;dz͋uc/;Y;k*IBjtY{a9_V%)rMt̕^l0C|(oBl֮n_Ucu (h~b&9.c'wr>jp@nFMY#ʔO[ϼBZݳ}tŕ[D}K>} OYwˇ,r/јF)҃>T#}JjgX.bTBZd鏿| <ޮvf7UF9L»/+^xvf9Esx9]O&MPP.u5 YY&jr9MъKw <;Ngפ|% [5ÿP#m  $z(.D Bș}~./94#<뼿?S;wT/6qb>_%F&vC0G*p:í_nxwp*VP=3FpOiJu'i@ X>$sJɎ6HEǨ9ug9-'.?;h(*9U:B(rϿhwT fѠF%t˦-^Aqh9Awdk M?$뽚^E5jzwHyBRBBwXAW I ¦lJl`GnbxEAb1xN5h:ЫaJZQ *s A?W7oiрt8UB](a EGJ~T':Y,ш^ Q|4`CBqAJ{.Nnw݁h%~fsӈHÕYw?v!gtz%ӏZz*q 7$$1obcKDkg0oI~ T/cRlzi,|Jn"ZT3OЁ&q:%I"qj rJrik&}>TEr`,"ED"nфyKNqOqkY~Lح`PwemI 2mzؑzDu"$@yETR[Uߗ)ZfP;̀ӝiCΐ3):($=E⎚k 7eϰ@G 6~e)|HzSNbQC59:F7JSUi`IX귔aD7WCq+] m!e*6c̩R:f]|6EG<턧&V1V_F١'r{?QGv7`Si|W|Ѕn*<ݹϗ vt0f;/AK5Ԓ)ڡj_;&[{ 7ڕ ;ֽ2/|3jxh&_T^5}1B`r//c.1Bs+LCZ:{W:FHSayI.`o`EEHd#Nl5;6N?Z}<.;XhѴGҳRH)EVz4IXr׫r `ev;* SPA[6Ύ k,\[vQzKfL^! 
caB)ȬҚwy͉.d,&3* hd=_ e['TXuo(I"HI9(J)B; aYoOޯ>~U'˵$v rŵR "AP>8YK.1ĨK5-\M2Őʜ9@%"gVrA>-HlLpm6ťMk^FpSuD3cUW83蝾4#CZq rJfQA\pK8y4%\6)/5߄/TB |b^%i#X"W$A0j'bwBIuk!!y0dJ^5(u;s'iK (%iR2ĘRvLg(偖L'htgſn˿/Il~­#$wKiS|{Rk" W'K>Uu #ETf)Kb j 9U$5'5'~3_'e7W d͗]ݯjƩ4h0jZ4~,?Xt v(d5lg`! J\Ƭ[gS4Aɕ9`fI5&Jfd5Ti-Iͬ121֋p h">g* #*(8a"T @) 6͚#[>&ZV~qӊW0jQf_]x mcC 5<_b{gCCSkg`aB`pi-4&⪦3 ~18]RR.2x$춃23 uCŞZHq""]`3)P_p{;U2cJei A\) !Ɖ]1{j݉[WA)\ƇݷQYLl3Krw10!>Pq$Q|g!gh%/H%Z\hIye'avq{h śrJG&G)juՆ )^.n~pI29_&P !\*ꭍ&ā2E߽}%SY",QBySMC4)ǨG0=8&~ []{ _Ȝl:McU䣱V^[W.R>c`b<+bLO7W׫|V `nW?x;ClE4ilrIt9O%^DTLog)#ъ{s`G.@5=~;E"dKx%0g&cӛw^g?{&\uAM rlxpG7V46s)Gji&v|;~P㄄mL1i%\ LY/fF;T,Za\2+ 3Y~mvDndIܺs QM^](]{G <&RrL/r(ȍQtLBը%"?|FYFab9hr0qG'3~@1\: GGQVa 6 g ō-#Pm\l]F +#C";,jG{3bnH2E:f,P9&ҒXqB(m fNSz=%Iz및OR-c07w?s`n?Hj(Ū5\?u߻۟~3?> ]0s$=,2Mg(QhsEm6*)Y q$.?+ygn1tǭ3l-V}翳- GqFd <;Kفg<0'"wI)[7B$a3и1371eOM*R4ғ Wx4:g28fEHTW<&UZ[&.hog^:^L^vǞaz=! r`~=\a_JÒ꣇k+x`tV03V!r을oav\+My.11 8=^9f8Ģ|]қ=yZkԂ]a s%tș yT)(RJ7ޣ~G C^t F&Jš@FNҧl[P$AED0FqP9$1hd(#7X{Ib |l뫍HcWnW &cU;$D4/|l#dN'P1Dc m]U"jdh0 CDy[?==&q[^8w );W&m|'afc JQc{P 7c2PkaicCɩ!J<V M7jcLB.Ի[3.15>΅ҡ|`OHyhk5vLѼwx Y`%18p1X,Y龫P~J4-=@F|y/69WNeڻ#r7- ?5~vQ#J[|m밗g.VVx[kN՘۽VzuiӨr_M-ThhWEʳ:/ [ŌuR{ B6wd3 ٢921tH8n ^5˟@J~~_W9999*Kv:dXb?^l\s{s1Zl4Tsg{åC]|(A_nIt>f?ϟ1X싧i^j&skxJpևc!N,6_j (V[ƶ"` 1M:LYK%Á1.X@^p6$lQa;ulPpGׅ 1"_mk q"x.+*(_ȍJLyH J1#@hn|GwS)LKj+IuZlD:*A;E{K\X( ^|f4eY/-Cj>Ox*$qگ]؄H.ys wU+}a._vW\Ln:V?>TfQ Pq:5`4PES ˱#@ pĨx=* ľ# MgWC[\D}eeT0/BC?p$kϮH)J84qt(h5GA?')Oi.xK,x \sp@ a@{L ,^$1Lk)1n}W֕F(->x+CPDJ:ɭ?T45"x6~.j?_jI~bkhgj۸_a^jwh*݇eϩ${%W* +^BRk 9Mp^HZ,ʖ84nNgejgc#pea]kMёe -p[ ޚ"#ّu"6@Q6Q@֦&#P8בargyWZqLҶp M"Brf:DSv3Yv-*:$;U;K+D+z 3‹iv̟Is Jֺ˙/ڷ>T(gxASg]Q!wHOrFϟj%@8K\1ȑ!.m?T{fg? hK2dx:u hH)|qHm.)2ʼn΍,_1řSiAʍ%hQ2ڦ/ۼTcZ\ EuB F:Z*BMо~k"T TE$b@NC`0nf~8 1|ss=v#Xa~sY{§^wHӇpozwRoo07w˫oнia6{xC/47? 
/ٍfaEyi?4$/>O& Jh"z [!{N5˫Y"cNgL\Bbre +Y[[~V{i E>OڈɅP_AwI_8 !hT$RrV :hT"DHI5X%&6g hΥy1៭_VlEN  ]8)v=;O<).{kvtٚHXb%Oa!c_JB:Ar;IZ+-Y=(m5[E!bJ@xZW=ݣ~Z#dO2@ѓRC& 'ɸSZj̕]OA40Q%99n~in]3Bsj,<7gLڛ%-w@6:rQ+]zkoCb,#8Q3P6%K9 c5zt kd&BLbYeJ ':5҄B*ә̬SeRfqpns嘣q &kb%h^)ioK#I]UB[A*mo+)%8?|[RU8Q*l֛#εqpP+ +ޜ fٟKni *yѭ$֨8 ֈ}"Iݹ5ʐET[f8S52JƶX#H3uQYb9 XbM\"S\22DcJrq8e4k{#qp]xqIwW^uQEf^2t{; Kl+AX[gĢjO4sU{ntOmFW+kҎ5w-nZY54*jK^ɹ^I(J5Χ@PD92^vzRNEc#a̠DCjܵ]*NVnT:mY `"<>-xNsz;6X~s-G;y3I,^&"o^<P<᪓z-^`$C͌8@^?@iYd؈ve /y> ߟ>q5dW Vpli+g -Xu%WӞE%j"Rɢj|N#Hu4(q4j))EC*.J ƥM]iGcCrR..-}ԟǺ\$F(qS,6HCB-//&P gTY%d%wYq}6d.y4 3:7+דY!ؽPbmZ_& V@%nA' CMQU2/mgS ~1m&P]q/g7~+鱀 >q5N3sM~Sʃ'rH\*]p1UyjyWe%\JimwM 9\!ٿ?C{L䖈+5MIu=KqNvc]/LdMżeMu梠q|8>èQ2yl>E&Ceeç=V6Q V|\UMlӹz*K4O,tkJ˧24~`-=u?{I˨/=)3τUlځ?| ;vTv)L d&6HI/ɿ}ki'vNն2@2&q{c'pXdrG]JB1mb>7BwUP]EZ󪴶.H% Њ0TNn?}B< #G|r,d;J}[6FhЖ]SFr_#8s-{:>תױM5?"y6*7kwyYFv]i!dk; 7mWjD-1[tpC99d̶>e9k/#1Dp۴ :lii$[25U8߱θbQe4"VNDT]KUc.I)R$Cd[[s$v%ZwamJqHD#E!<^Z3\',U2Mw{U2CNts*w 9X%eU wdi*LC&BԂ,yCx 7'!Ҍ ܼP//:]6PIns>i LWКGf h^CyPzd)(k/;2}f%%ꛯ.v?egX[-f<'䝝PۥF4D!֤N '<ρ[-A=UB*ϝ3MBhvAfF;/͹ )i 3Hѻ2/r*iTrHYAɌPψUQ9XTZG47>D}XOP_ M7?tw~%}_}Jߋ>j96"$J+ӭ^ ˥Pfq99\J&uɼ+ w+-q4}`QrV+[ˢ_|5~ކ \x ! tgl 0w*OFVٜ?4JJJJ*JZ+B]M9.,M='ֆV*Nڤ2i4G\47}Q#:n5,FTKi^ڳW<À˄*Ԧ7 )=9X+5T{.5Sl3 <0-x>֣Pٺe|3(z e|8 ~9OiRrQ~glonsǻ;|>M ŅBm- 7R6K)% #G4 '8rᙅ\!qr/K񂺰$`[,թ%I)7)|cLJr4eSmY ' bR\ƍefZ B B }('ה{" >!b)R I#2]ꚾQ(+dRhB+|!KPH.5cRiGfbf0 (|UNAWDX0ox"r/_X.W4q2ř0vV+᭒>,@\ZE3`}/IȻ)t6 m|Q! 
A$.sgI{H+t!XOAlfdnDi)\I-b5ݔ<40+aS R4 BP$XMBkSMwe\ EӲN6l3H|$d $!45H[7ZRm36*La6Ȓ3 Gy8FWi%mK, RDC +fY'3d9n<}l&v,Tf#z~E|ȭ\@^nhB#RRI0Vklf TK3cŨ[WFĎrº+їlSd6BckL $Q5$Cl7f5GQLyỠ|q _u̶qF[!t02H.65‘iidTZh άx R"&G9 w^ B!)CAjY3q漈 ȳ1*3P&AEM6\xQ3o̐6mDuę/g2j$P oe ̒H,:RG^dN $H Ij!^ ,2J*Rpq5:m#01X5@ 7m@fD"3RM3N!)'au 3&W޺QiS-™4 okMpRErk{qۢm3 u:;*hIvvr9gp3}mo 'mFI=B6nn=AX:A&J8(tZ2J>Ljw=r``qx٬"3k1ڼa@66*( >y-2#[-ZUo7?ƓX>7v=yI^8p0({?=}^/>79C]7 Kp:\SH{86t44^ WwLooghG[ks 9fZL#\:5~E3t09쁞ӕRUh@rVqyY\3响4؂3Q*#Lhc50L|0M=i xuqB#?NfNGňȥ0o=cm1Uq1e  ᚍt0)9J3Ε7L?lb8FZaP5Qy(}jd҈J*ZT3W;ޱ׮qٲ9/;Ԯr*־x4:Fɤ.;f̆E)FL?xy|0308GʢRߍ_hC5yA4ܥH)JSt:6 K`8<.LҶfW`cIH=v*b/cm$[7EV)HMK>@j/PuF|: Mn^@:/=)z8-BϿFG]O>~H$ì.׻:n~{5ϒ/f}G?O&u/.Ǔw`տ\,7wy8EzbmL~)+^dnȐr&CtR6I#TE-KFvC#ƣWx:ш@yT.~on?N}lq*95Vs%O˹Om˓6erYWӻX4>Y/voPg9+]a=5_woGdOҪg"VDSlL9o'{~]8s&b̵]aǃHugXpOme)cEE8΢l֐@LRF[pGo86ըX1$wCMsFkRV;kc-R>}x8[Gr7M<]B %W+KYx[@/o8ߏ/'7d@Эd:K*R-TYeF*9^$iźqgGwԀm 3p꛻ٗ\UKq[JI+gUi=WZ<[V7{ O&_R;Y=)Ȅ}x虴kҞ_W(g· ȿ}]!}˃uO4<+! 4/sָ2c_|ִN0y)~XYHoeL⭉4u ߒ|EUsC *9ôg×8X/=.W>$2oȼM"-2 V_[Bs^ Z{t1x4 k}pF!пךf4;K&֡>aXx]'5B}OX`xvhUP T7fw_k1ϛPS\py-)%l>0rx=ILz0zȏ`s"GjᜆwiF!2Rˋ㔕e]75vYü hEKjKZ dH;٘oFqp8b.jL6(Xr8 `v9""y %^8[.gC2sJv9# Ą\r */70ziF$F((g-Kb _gKɖɐvpGMd&:>z&ڻ;e!]c3 p]^tMdJ\•ZQJ`WHk¦OA&TWKxJ=फ़Vw؍w˂0<䠲\p.N b'0 ZOPf<-{-ZL+UgW0) ynL`8oMp=/ˀ|kb < Z*+e|PٴZHov!{Ilxsΰ';ltDx\h|UɔC匒!.j;]'(s=V'tU7WycJԶT\uXǰy=h8{H<#P q.䓹&+@HPP7v}sCƴ^uhʰn,JCv)DPEql<@Z~ww 7ti7S7Z-W܆. 
F)A^|CG´:~}m\y 0^n-N YI[?RI Hib2vX$r\Ȇ57iOC4։IUH'V*couZykq "DxDPCKp'2/oaBT&ycǍNh:.\{:+`mX8+RjPdE;ӳQ:ۼw ^-#U^nB*,&~=O+VI^{ꫂ\.`f>xM'ݟ9}*<Ny.4 ޭ0@Jgq4OdDhUM={O{{Y6=* \7i V2,\&-[PF$-2iwg] T"1SJtP;OORI҈4fDEDZ njcbLܜ'"/iͤ0L0`Ju{ 5Q׀8)d]U(*eD*waa5@S\.KfhneWX n I(0^G\LLR)F0Ƙ6ĵU_'ma"NSZIrFՠNICmsNҩ*c ŖNxwd~fti,㎥L5"6 ڪY$,@R<7MN&5Rlxi4D1H2a4 )6mm(5zaWy7wu0XRw2k$# BRμw&XnX,%taڭ"9XE_هۋF1 PCQ \v#,5YIrLw;nDƥMmNG>R2p0V&"Lp.8&V si2M WxC' @W<nQi Ex]%fE<E;W7ubDxga)\rJhPr$Wt{OZʔ}T=9׼vB^4رµdT}i-"GIܺ$0ƈ~GYuksH}~o'YnA=k_YڈFJgcsچ5!OR tk0QS t .m+ AՕ[xF`p}v7r>\yn:\懛S `k͊"E5?\yV~Uqҹn5*"V3Jk򤉭Ǹ.ޢcѾU.vz0.!9.qp܎ZBQb5& .DeM&EF-uIk@ Vpyrfͩ86Mb3JPFzI5 I :p`5) 3o{XeE^5~k@R D52@8ML,r$ ht" H teW |9Ci6{Ъ<%J} ,:)l1!` W` [kn{gW:z[cKX͍M #pN2 |& Xs۫=3xg>JߍLr'k 98D1AȤ4/۸VBJkn͠j}o냙ٚb2M9x,!B71ǥjMk@R"r*]]$= AJIQ#e#F2$NsLy[vz;$me 9Y]#ZpԜch%¡HVWmkRw &W"wT*0lS~T KJ W_7)a3bkpq`VL&㋷M6)M}a(骭梀 aflkV7 GMRlX J>[sTVgXIE4RE"Fir He<c# ilVVpԈ?`űF,` PŠ :.̛{ `U]ך+·eg|>qټLMii$ҙx*Ҡ|gu{I6ic{]w<}GP(,kA FRp-&Ed a V!<4ޑ%Dv"Q$V5 4ŷwՉ4Y+jn1*ʷuL?{w xp]PMJwd%܊܊xcq,q$6;|Y!kPyCS_z(E۲e|^2 N "aFCL&5WmЁ5XZh\;o,IHcع^(Yq0ݻ)+qVz4/.BZ5a$X-k3T^:gt+{^2Ʒ-cm+a6v .s䖴QM0HмAk%ɫ?EZI^oH^<[ v^K~sVn!\pKtnOnk[,@0X`v>B+~ɳ.GV:y!,TKnm$$*vmbǏ'm ʄ,JWpIFb*7X9X7F&$n67kV3LI̩Y^Iws2&TGq\r}K` !pM!wCg'Y0 "w{7chFW>||: owSb"%p薬[9RuKVvyCbKQzNJY.FIB1qη\EO7,czN%~dQƠ/JtFƚ`ѥz_էlïm>Ȼr7 by7GkIaiB Mw>QH Jk@I,e6xLfRSCǏ`Hxdt2(FtM'Cw:0lY󐽽*AW,Jφ3mZZXrٿTRP(JQѓi6k:Ys ʉY??#Dƺe ~ey̩9,O aaj:5m߃e~61`P}gӞ/ZV^fo~K~Ozôqs˯;;8s `t NS'ܡcL`FRI48KnVsg ϑ3`rGӆqGpstDd)!bC e`YÑBJJI.p`]`1 >0RQ;6Ԉ;6$ EEM64_KZ%\gs@$ݩίT(,|TQ)wr@pBF-X!JS-(AWUAQtF5r8OWJ]vz XV3I)9RfR+ʽݦb'4)/r鮑 z5KS ާ?(l6mڴ͙3gğS (ZNENpDT}=\.O 7PA s,F2`AU|Z/USRk (&kK LP0ViCis$GϬ@!)T7kЩH-h40zwQK~nn:Ynʎ,D2ɉ $紟\sW.߰b &{Df'pyÔL`%DS>T3 099G4RxR;-Kovedzdg{Ic?$0-YXĤi3O};V$n8 sQfi6\ N姝9| d?Ű{v7ыׯvmFgO/tܟ G; 9{`wϐugw ~{roϿ~3f~^zo z|a~׏<–8=_}dHW rg G'n({r{,_ѝ/KO1 ?od4=^h j ,g>"ȇtϛ]׫׻oc.diprv_Mz5La_Y//gU{3gF b-k6=;-n6rWj~{ĺ >;=}\WXv9_s>:hb/_ia,k}9Yq37ٻ8dWypnOe*?fw[T/:>$td;>`Nքup3+v>x~мgfSh Dp5Y6 ƄS@;ipdB (b0(tNq7>%@Xp#  0m* [/W"1W3.=c!s3"2nޔN@>b 
>%,X3XpĂ#׉[ܺ6a% s%4ǐOB:sxJ]lszu MǻIԥ&uHMVn~bx4 cP:xBm۲FM|Ƥ{1Wio+ RG)|;Dء =PLBr)H9GYhH%fJȴŲb9K>W1W߇6%܊ٯn%= 2o 6PJ9 )c{a6v{iB 8N4SJ/Vјcz`L1=Nn第*'OU5=0;P U9P QI ʭ fća >,ć AB΀$EtL*}bv֌0 2j>z>_==~ɛimEuy .xI$< X9Md^;xos1Sc5\c5\c̵k՘ڲG[CنCU6P j~ABȂ10L;+= wcʘ"= 8\XssldBMk  b3JX&=j^2hbĔI }_PiiP` P` P`}@[ÁBUPj~kaN7P2DXҲA"; U ^f0,ۖ,aF:1`rkD:>Chi#t2Kω{ZbF#X]˫]ۢ=x]NZ&>N9zY,. 5̼Y+d9Q90k&%}P=>ϹvӁ$$Jl=L rYЃCR*hn9(3EIq1y+QrEB B9Hk-A0\;Ps6VFi2 cER4E#cu"3awQ`b(ՍA9tΓgIvwnIs4;͏ :-\gp^_ԗ:ثSEwfE_?o=r%~:tLyoLŧʟտ3;|tWnzyNh.+ys#Ұ{o.Bn.,%wnYLH` gxb"')Yi ]s%wgŷ=-&i)U9su28=Cޚjm"9Ld\hk9-I@X3^!V#b!y#jdX{) Nt'+ Hb6q7BZ$8>U <ҥIȾ< qTN!aocF t! Y .K8BRV!simKhd+C::{I5sheq69@}YYE:ĉ{= `43o5#9#%\%P< dF.BB¶pQ*_HxQ!&c˃ =[AE..t)Q F7/GnwIEA2uK+!gn fӨ*޳P!]G`6pv^#ymTea2I-/In }hleCXV2jB M{4 "ά/1B( ISw fdmiOIh3UbTc%۸-!7='X=Hi0 e zp,`#ݖRƬe0\v0pYZ[TEU[TE5آO#`rw%Xך0A4pYi>E!xK%ImB#)g*뱇\ѺU}9!JFNZɳ&E+,RXw[Ld-K@挹oO a[-aBUƒh+]Epb19 lQ"D:KMHJqkBArd3EQ/a:8㾭쵖suz9 'h p-MF-:׌S.% y#v#iۓ ȝ īZ>%hI_{ċ$#?-O_k _ReMpQqAA2 \hEhfA)'0!KB1G @ϧ%7 %rTK 0FOf_?2jaL*B/I&!80hslLM$x$D4ar$jHKYIRnjɈ>VT .؋I5vhϣ}}BrEk/ E]H95e]@5-}sD|-jB.)ղ&[RͶ}ܵscYF2)jT,ոɖ'RP Jef,h;eCwX6z,c;2dCR+PDYF1kV8 ډ>Ųh lm9v 4c)qn`㩔V\n:IT wEB0&;ؕZ2ø5wc*$'/Eղ藂5Y֘z;v-C.>gNseڂ`'_r]JpMu-ꅋ09 4avw.}Gzb :&[jUELooYuYYySLh]T^ V:k!Db5΂B+OCCġs7kbU$i><;UuQu^H*|+E3pbD,ij/W&\_NI;%Ay7P4-;7z (BTM9xDo/W?꿿O%,?۷ߤxʓ?M15?jvj'{&ь+mmE_r~s}ձ;yN7%{\>{K;Y&{# Gg') .<57.L/vAwbWj|Rvm+o^H+g#eRVvs+ie=+뀩X.[G6r(g䲿pXۭ[ӬQ%4,$B+$+2PmV m}vkCOK'N23YPJ%4P6$?J{ { AlE|;lv oRx{~ ov ov ovhzktۤq6ؑt~SYӨM [;=hہ6բjxdLP4Te29wzE)IQe8g0+A~:>3D%-Y4!M iڒ 9㑋:rJou8KS: 0˱0xZ^6HBxIF!0f|. 
3@",HYݟ\R&Ԯ>\O+c")W&"CrB.,g!:zfHXe3&vh%fDτ!rZ'L{QwT|n[fhز9O*``U߾$=չ;?tBb}4}}櫑cl|d,9`6AEKCr[hr 6&NnsndDd&Z-jP^*cF1pbir"R CMUC@ ׀4|1CfS$LEAJij*S/ riQe"Y4}xB4|Jn@7km8EC3rwW_—8ɱa/ Ft"q;A{.KQCi#׶H9UuW}\}[XSvt.>֮_,E?->;jM k:<8 /ӧH9m4PyFyq'xpqkoޟc&Uh y|F ҇_ V}cRNVjfW_=\-Ն?+J- mt79.bQ2ǐȕu:Gנ-cDl8$OJڼP$* ЄX'3swR#ujBd4>SPrP*#*RV/DS7\д4㕽P;]sQI&zpiw#KmQ1vm*Zj0ڄZR`2+Ts$ZK1d%T+ ,9e8#ESL1UHN[WƴSOvȟV[RL{]{JJg1W8dpR%hAV18[7,oL0p'@RJհj̚'zeeHxhtڞ1b!5@ (.2xzEZvM0td u)5Nhb(CgO9/ /(x/[m'MG#e82gIۻK3w'c  odMZ{2<«Mg!7,Hdڍ5qh1198B%XvewHK$d@kˬ}C ńWnmbYO@BL|Z@V\=^ZH(M1u=D;J4z, u7&+E %9 wa$`}ٶI&[mF.ʶ.jrFr`RK%<>Lu5w;iY;IirKf ?I'r[kBpnҵ ]c2vVR)ni$[7QYf<$ouIm|Մ[Ғ;R ɀvHB0`Ol )]c j\R~7+fʄ$Bl*{Ui ;IZ;i3B:zVqg@k_?0`;7TNo+۱5v3#>Ċtt7jZ_{k>qt?ok;`WGgT{6ij=8iuNZ݃VdjqՐ໊ڊ2uDhLF8̜ݻSp]ڱ ; [[~m-tWMYv?F΃iYU-E;#ؿߥA~\>Q.B E%`0{ n'4f[Hʊ#ۚ]ETW=ڽ]R^NnC.M#Uvm{*{=5B0geeO Yk;^p[%f%.c]Vz6&_!L@! (## IdS/홁9R cOVOb-Y/rۺ@T:@M.V^_g@:?upr}Wr697>y']MMiԬ>x||sȋpvL?*R{^}UVww7+ـՒWmrUL[F)`*O)ӬUT͔8Gۥ(oT](.uZ; qoɖ9a)ϖ7na9La.ʸ~d%{ yp~eZk\,]xN8g&7zkm..>{b%-\,/Ϛ*4R:$RNcC)y}66m~ zsk (d|F=￿|@#d<@[Wa,kX),VA/_x=o -[)=z:,#Wy2Y*5^0 8^Hҫiޒ/]y^zEkIBn"BܑP$x1i@OƼWx5 m`xf # 4-'>hk0Ml71%k裩npwᤝF8iNi RJ*ՊtPY"UE؜Y* i'W7ELlo=Q{ݶIU?i'mWd}W}FngENdc ++[Y LJpB=}`C>&}o ;zS\cNvCyO^B+r$fz:tt.Wju5.33Q[K2Q5P:Zx߄§pxWB;ÔVmGꖔhMOHKYgLzs.f+t-cEH]jR !.&<ˠQıRƌGsPEH+Ov5&+s@`~*|zub4E5އ^?UHtݻ͏mڜ( 4T%h~P"*\B)a'#KQ)GIO]#XE~7w;{\+obq#D crgt2;',n@rQ$dC!/iD, t%^ZE^\-\3ƙ!wn@REr]sɇA< XkB H`]e4%b&6` Aa~`{'mOuj7j+RĔ[j2I鬻p l"k>!ӠR1s>V [M@\IA1J_FLTV&L122U=A$gW{ &CN%8}RKlA'$IY-nxD%xDx#ӈ^O35-`XVi S2Wq(vA_Žt;k0ÐJ 7#c0umІC_CcZFO;@*17X t&$';*kڶU DT*NƑʮU.iBi#%\tP6!"\b+lP*朢S}Lw8rM۶{s,fB NqCu<#ae0HaaؚN@mQ 1H!%c] ̶(Kq7m Vkڶ]&QL#vI ` CO\X:=(}Ê$ r`°EXCPDw}4/AqrER;ajHMަvQ-4ާ}Mi=%&\fXQi :(PPTI$g59ed-T/`Y`)e(!J>hb&.' 
)?gQ3S lБrSRf J,O1q*ss),TDOGҿ+I]-ˬ,CЌ _u{ P`@B~E9馡nCK"R=mM49VڽmH8eO]( s&ZbBsnC{ڄN|ec$!6 YVbNp5+iQ0qf 4F6 FߧC)#A AO|Z_V\N3-i PJjY>=}X)Gw !^'QϋQ7踫SY%)>g,NXdp/z)'ΊN1H 0+?89d{Ѽ Ѽ Ѽ ѼhdRj́r.*:Y9LLVPRЪ2Y?0Aq#YXLҩ c=xS_ׯ?zCn䏎?TVޮ].}nQ+\-nß4˽>͠`voxEK_N]M<1s-GaZuvFǶ<B2Pb<7):u<ϭ*̘Ҭbcxo%YU1P靈haI/e/Rl?Z"J=I1ÏԜ{$<(4dTV3.˔2Y] #r)Hm {4.%3~KFIF:lnĸdؑ5M4Gk0 e0/XB:sy(SMV*m5@K)dXT.3K4M^ZA6̂bd,2ʭW<@֚DK!խ"_+ݠ[WNBJW>KԄ[-o:iW{Ds)]b#JYg4ga+ѨIrOڼ]YC̩$ZFR=̘ @ qrP0`fh%XWDrY;m¸0\W/ÎB2޴ȭh jLK{{2n"w~K1BclZ"MUQo5jrԫհ߿?Vv~2ٱ#C䱴 g"]:\f)MF2W"xaʊْG8JeF>*K}6(@@+V3# `qAn^#?=Ox&DF6ᛧ tc8F{c@^E @.Ln] >L&|,ʆ-MN:|E&Y箹~S9,n5Egӌ**Z4g8:zZ1m-}w)~/g[ܣQww_&Q$J$\E5HfP0;pzI=PiiEW=)Fg~SKlXZ$v|t2jdy?wxeVJ گ`<5U֎{v,A^^7:w:"FXȖjru]ԏqoJ6>S"v&J/Gu#pJ6XaPnM?LDo޿P+!6G >e ŤDV^]G}Xu:&_v^\/NJ}ywTγW>_f~ɔ %9ݼ./NGFAk. zu!7&ӿx@]{n1Ƈr`2Փ|"%Sܩ}f[, |D'!툾h-~VvCBp)On7Jbi#:nnj"4;vFW!!_v)R0{#ߜ3KձE`:&>[Jhrc u^\l,բ-UK٧48PZ}):Wӟa^؍_5B53!7ɾu庝dRfifݏ&BA_XPu0 gϖdYz9Y7+!eu76j] 2gMw>=Z$dfT!s[dzWBL4-M>MeT{ΏL5y/oceϼ1)fN5nR!1dJRkAHZ#hG3zz$X dտpg]+z]<^p:1^pw.^^=:p8 iqqaMcuV?4N^';}81|w Ag҇+h̐l~}K#s:k<+a:Ɛ>ПwqJll~>"l]Q]QE(9g;*CW4|m(](l6]M>a};R|) _FCX>Z6f`)?&@8sTQΨeISĆsw*KƱv@9L؄9Ѳ22?áXTzEv]7W=Uk؄Ec'p24[V|m,nRr 26IeF1wߗهOux$ZXҨf%hEL+)MSщ)#–Fcޔb1ޕbߖ^c+}*0OGݟŰ󣪯pZXҘzl K8ј< irE*hʆ)5'DcM46"0fƒ}w֜~7a0[*R1S0O-tJg,=DF /_ERX7V?Ũx.2j< cZ3pyQUU U3D͜LJm4F3rG_'vVHBvXvK~ Pgځ=$GOIx ,yX.-j{&wo'Ŝ@>* R[pvN`Cۖ{jΤ5VU8gquU(%mpkFZC Qօ{e€Wluࠜnuc&vھxeOPcztSo~-њ f3/СԪZ@Gp(F#zk z[g^| ^udg@ѽ:e#1/S:PJ++SSP) ҿt\ʤe$Qي)SiFMA%‹w&S^, hT+Z@QdY=[O~F~9?YϊtJe^?>xķ J}Tl@K0,r<(YH`8q tGa%=~KڇLJ2EEO>$qhbTGHf꿺,UN[j0"Cq5|-n?ӛIu{irQ+4 2U'dXN4O6z/u0ڠ e~!fų0Uy181 5֧' %eX O͙1Ms;#Sl*tM˳Q!s9{i3dqYٯ7+rOMJp0xt8iwZ|D{|8>#Ӫ ]*GK_?_BJr.UE:bXc)b]Y8+~W&؇`/nBgN_?T^mɦ,R"K+T`D0RB@I6UrdC<.5Q*W]Iۍy1+Z߃Km9i|ΙVO7l0!2BQ^dЫAFTnweuMI2.}2.}ehf)FӴ@LJ5dURR2ZCKW"p&HPbzA>m 48zl.@6㬔&IP{W}rމ:D4AO2(,On՟ fz <@yA H'F z ҋYb{IP3'%ޙ*?IVd$F=e^t*quka*lenWT(SV$Vy( h@Nqh5ُl&l[O(}"x|cbDA@d0D7F\ ,C } %% !ɼv|bm?Ƭ(!Y1w( ?=/V4d"ι (/$yzv ł54 É׋^7`!"9OP IdU RRҒN7YNJcu6WIFt<#|8bDY. 
2aPF쉫;#qF ڃ.TLd80'0G#7dS9y.:°w.8T|\`v\T\qūex /l3rmz # .7Qiq%#Ϋ^%R3&#L-11vW-߸!y8hp3X)KLG '_6l aAS ~Fe ,$C%#3P%Һ&pUkal4xyXuFqe{끐&@ ѓ؄gJ]ʻq})L%]0c8]uh@PX#*_8ADq!e0Cw|A$0U :v\Q`32dk$6s6VF|p&j;>UqW40L*LY#(56+M njХv&iyHm*.0b*Eo[ZEzV"so tg g3^7f׆@9(5c֧_uhD͙0y|o dFT<mAAnzlyx3O+f ّҫ:ԙ: FHEK jD<]լy5PGgݘW9D̳h.{*ɹ cHF0ogR4I(`ʋM >'?uss<3:X2&zBC>ԗ1$q׵'*NcPmA$xWtK[Szqe}zY[X sK˹H7UX>oʛvX=|~ۥ u_ᗛr_ևzw.yc=oe̊L\=e90ʭ9`bf5δ1]{tQ6.&Z E\3ܼ"p6(`덞ًB8&yfb x!79ss@a:]4C/@t:'t!QX>6-ZD0HɭMQH(βB2$#pVL hYVJK#y/Oe1b tw3l[8D+APTJ"ϤeA 'P dV}kEKhqm@O% gss1M:;o|9O!zf[C^ؓ,sk*+D'`|@) 5#:ӧkdJ0C#%eFWs aXO1%vçh,׈ΛSk¤j} h۩W1@"ߞ^ SV,@.1vZ?:_,M"=TV4ү Y Rggɿj .UHdUR9&RfiPkm6#Z:vXB͌lEG{D }Tcdt z5hXsER>oEiQ}D*Ǩ}pW oeU6dgM(%:<^V0:Kz&_ۭ|(WQ}{iQ>ᆇasGIS6rm9Dx,IݖazF)=_I`̧s"eRfLLӺnH*\|ƇxȆhVщPӤFu 0fك1󀌬?# :AgE; 2edM2A4Rt"F>BGޙ%#_1\ I"K2IIUd2%P]5b?~yj1?,t 'z3;x[8pBG,ͅC& BEk9a7`tgqƹ#b2U!E1p jS3lBb*΄-.yC.K)c}+,:gl9;ѱtG OU4 S5Wѽ+Is5)j5}X;ط|Naۿ&OiX.vܬv7`zX=4>}{{;K(֑/nAj{eId"#V2VfRGP(eb"p!Dw˧,]2 柇\1frA A-9emI /.A"7-Mb< ut\M͸@#y.SܟjUj(9feUd=㤆lBqg׋̐' 뵜 1Wxz~'ߡȶ%J8<}YwܼfO/1A^l[f`[ _a[<3.bnE%ژ\e,e'4EE RiɒV%$~Έ@aA,>#6$*+-mk܈gKrb[~K>W GWUbNU<H9(y7L2ʊyrrgOMhj㒰y^9"sHPrtگ8L1zΎ 奐h#m^IREW 5xs>!N=VCF_RCnjfI߷ r<šn0C^XmD;d<W/b@fU~#H:D< >G#ug7 +j;E}3`Yz@\q'^!.c3>#~1c4iq56{&7o|lrfkE\{O&DsiJqO{1lG[_kDci{+{9l Hk!Jan?:gA'[o/7{@R};F?^]`Aռz_(zR f+`;|[drRg-3h{cp瑼ĥaWGӃ ֏0Dz8r @~%$mLq] o+Nn!Ox#L4#KN#۷--ٚݒ,;J`}cXE~$' T2&:e;"r 7j1ʉ[kN-5xjpw)>)z#hpM{z- cF𷣀͵?3>;5m',iJVӦS.4wi1JJũsZ{=yxRP: qBp!`H B(  0'jcN;x;zi(J,5oFyop VX2",l;:bo6]D *x %6Í.FI\&eػ$Mwg_+IOKHf!V)f6<&084\̂sGw(J!ӭk$~,[|Sh ukUbܚ)+|yعP Lr=g¸Sv@;dX>L}KL ޥfXW?T+qzXWOIwz~-'?ƒҭY@)vE*Ab:n7B>Y~Z{ X?4ii6 ٧8*@؂ʻΧs-٧9…Ttͤٝf~UEb( . 9Bb dJٌ\.'WGχ~hZVˉa8 )|hHEzh"kL K*Ii`"14B"p)EƈesR& XZQ}F0ko`eixUͨQ\9H)@ sSoGPsyy8Iqi9EO'o^?oL4c74wfw ]/d=~ިO5!DMH&$EMգ32#WbW6A(#hNeD9>ʖ/a1OZ7!.W(Ku!VK]9QV&/V}u1"(vwJ%;#'7躯!h&id? 
7Ly!V:&z46FPDi5a>!E)p9r{/t&*5&,R)P])+g&J[E8JSĴճ4Ab([zUNqVi ̓e͙.FB'88`/%.0 V@k=Ի"Kq3(CHH~7Z4?%D)HeJ -LN(HqzYӉ͚N$,SىIFOW hAPpFF3t6Ѹ RH<@+'C]=x7+:SaFt# C!pBP.Ck4Q%82C:!8}cT`( FD.fѡ/vbz,,?yNEy,Cb` mpm&K?^^a,-Mc1Zt1J׈W3Wv2r13ߟ\E[z"$ ]}KAÉ7|}?r |'Ts$6\jn(-f|%eE++Um1z"3E&ź<-wy{aLȖ)|!OF0;cT >:BgYg8@Hc'ji%}w٩'*3'ZJm,@B,*iO'`yY,8Jc*8 =:ڏ&oƣϳ13}?][s7+*l졌;S凭{jn\I\3Ln+J:O9CpVJdq8h|ht} O2 i͚Pr.m T)4H*un<Q#Ȳ%!԰38(imR#4SY&<:2I Zm$Ws>pժI%~WdSWwn X80H ''4Azx}إ0L$]C9CYY^*߰!fng_zE2KGNʺppUJrFC9.ȿawRА[L5g{#J0i6Hιy]Z_; d>VHpTk|r$A@\V zדmT8^JXCH B(,\ix]*JRqB;UgF&|ֲon;m4:Dp}'ag^ບ BAFZa )kņ K!En (\Qxࢠs޷N0v(w>رfcl;]1xtW0L ki:Oiy.9-/59݌;jN9lMa~΅&}/׃, 7v:LIP$gF I6u\\3cADӦfD䚤Ehqk {@7n,QVt[X2{uMNrϸ*4qӜ1$]pR0m$hq #-6POf"n ݖZ&IM#x A Z[Рhsٞ1I@kˡxx䝵,cW'9 k98q[O]`3^F36?I cLrR8^E' 뮪=iy>ql8rj(p OT;#wg?oWIhUB;j+# x=߮"/WOϴr'˷ycqqOOQDDu\f\{lty˼?5-dљLeƤӿcmώξ6"K#ar-֖hK \).Onh ,"ndVz.]_\]W 6oswW?el8vXJG!8( AG~f@#2\i^ւ׾*PR08' [KL0W߸O +g ^X$YVJqW]%9J7^{/k[zy5Wi_k.@2 8g v#3!f#]Q ғH e59J"(&]]C%F9 UxqkF#VBnwu?Dt"Pe+K=xQkk#,4.c8x,YH  K`Z) [EK%vQi21d]LHk}ϬЖLbf҂/c SQkK[,Ja)  h!UvAB6)j\ͥmZD0$I,:Tm Re,h𶾔D*n3-T`i`2*K% r rD= JOt7K /rXR&uxyu_3eC"|=R gy߶0߄? Zsx<0󽞥ԏUw V.3{/\Dsd ySnĈN)pV Y5vh=Xև|"Ld CmOEaKsfyj8=\:lmFxIH7 u s)>)>¬ccр<\ȕpGM'NJwU$1z0L .bg&pWbm1SObAn)SWʌc|z%qVplꒌrPS8O6d磳amLZN_ a5|z95,kKƣ3ei˦6{axt ,Li!ܑ¿|L{|tn!oDMBgŭ1\r'z0EzH*`PLL~jGDOiCd(sZ;( :G=Wm:31l_̛ILL _ϞLLA~#\,lj?O=o< -ˎ ;ҤeQj2(rCyDP; F(֪T&A#y* / ?DQD))T:{IP  LVBd6Sf+( -iRsm*\],Z'H izC+b. 
亪0g vk?E_/i7ni|ܝb V?,-~ _(ۛ\=x ސ)էyj5U j'ϕayDR73u:B6ALRD)y;30Oy;EdNhLY.4L":1f;bVba@lVbhL;H A}eu%U s˞'!CoQ6bZ]DQM[+>񧮚 ׊Z0f$,@_)ZNxF_޼di0eVg?wgͳz^EN3 arY]>r7OK1ruB&+>Yga~\9LZ8B [!Mb5tߠvQ;/c&{ŴۊV=V1LZ./&IeLafX91JWAEݝk[ץ@(UѪnP5`|i9"g5,G~l..5ZeGZWBJ &@ daZbÚ˕Hշ}x xߟ\-ծO݈5(+!ZIj>./0o>+-!,bߖ`⪲~>_ ߿}k $o76/^ YC/= tyC0afϚ f9%Cv~)}g?'}E1 @`m$,#ӷO._ Tgi/ՠ}6O8$b=˕V: ]&o{|ζ^apT Zdr (Z)\g6 ]/㋬ׇ|"#SB ƛ=r11RDj>$ ,Byl?]qB{ө JghC啣&I.i͔$2I"!3Q/9Um+y+LLi ([,qk _PpAr<=5EKU;fk]zz4J9ŊREWHC2R*!0fU9kfD p.9c @ ?c@ƯI:~qMRFyQڥv W 0'S&>>Bxg|HwV36tTLh;_c@c>`1(fL[a& qwFX )ʶ3U{ٶ{z+a`d/_he/|ztKPGI)ku qK=-n/4&Jc!9,;KI.kȬ iPLL~&ٰ>砳Y+6=glT0 8 ~9 (e卡iUh<7uYU!tyaK1i`qWz\- @ Gy89 _uv=e'z? ! ^*Vi;uoשo/.WVq#Œ+io2C5z`FWc2)9334w/;1}ț1pRVZ1nMZ qlvugKY[nueA86^0BW2) +gz,KϸzK '1.\ܧh#2S$3vE?X8\;NjC(tq|QIΧ Z 8R#֨tkŬ2r*J(C"-P aɋb = ']F-H3rFwx &n@OLIW#BIapK$K+l%E('t"qSg*fKӕ^T BW*CkPFS0_gQ,jQ7kU՘ajLVcZVcZHbH\ap-@r6#_8hQ"AW$W}qqsjNXc,C.Gj߼gzG_1%3 8<$ oQ6,y6{栗-Z"IQYyLů{}Fm("wttx\Lf)SvFVLXY5S,1`(55 kA0`fE)e{p c0cQ]왇$/qp輴ך\*ߧj^>Nd4b́)`"`&R<*8&GđԼqjeރE3R"EA'KPl)T-vFR̰!F(\3a^۟h BJ"lLj@M Yg?% 7AO-5ƞ*v<~!FS% nXue4Rو1Du<`< E`IqPUk͝5"AՖBgY 9ü)IYIHAAp=Y0^hͩFxꑐub 0J" ;dW_ 8%e %e:&52B+ɝg58k VRX%iQT8Gps^KDX/K-҄ ҵ9vX .agS;"5!_Z%bH DaSIV? &0ki q'89 x0$SP)- i1k!1G#Kf*QkS17įSk s̸ z-/߲j1Nl-oo08k-Z>. qF% ku}+| B"Dq1&h^ ^m^p dB(#AT\ 1'bbE8 #M~35=P;xn 1)WT=8?& 9AtTZФ8 xaJ$5/0Xst=wYc1Fh&\is5S:"7P ?߭> }g n(ж;ch f \o_^ i^ Ib= OozaڒL[OW[+&ѧЖb&ɵ-F'C4*n%-K<$.JUNe%iՇhHaլ4.k@"KlL[u3_Z48-r*)Xvל@iO-FP>Y$a'YH śM;?*8g3>((ZwGD[{c"B Gd]ޞ)gr_A :ܗl~ˁJ!}!+p0;1Gj(w=wFc\ 9 +fst0✢q׶V1uErKXK3bmR} mX_t;qهO}c>n"v߲1,5{'S\H_2߂\guڢ͋%O]6.ً܄O'~]%E>>n$63>~_()9 *7o,䍛hMbnz7:H?w tr1ĻncZ9Dc{ޭqͲ\)snN3xӭ}N%̻'#$zM4˦zлynN3xS-Pw+a!oD^|덓rk*\kY1G|k+&tT O|p;֝˜"V)*;p9%s:miEŔ}+7R4[5!~L5U m9 bg@Ch抿dJwR%J@gJ&< +ijo6WU;cn@mK]-2AP,Y%-^{U|$_ %K| @ݎ,owua%z4Ϸ+F:kvA~[~gx,6}jI'tߡ.)/+(K({!9m-?ʠ `Z΅*_5(V @ $Qqk~Q -ƭ鋂ya,|ˁX*X. 
cémR1]4fEa O?V6֡MSϏu-*ZUEmamZ C),%b;Pf44[υ_vy95Iu styMPe/F6o5'-}Ï6[J5Fxe $9fЂ*vʓ@(X IӚED2r// **3Db~#2w" #^O}I*,9Fшy[nDnw)x 'I$4=GÍ|^~>5= xt+-xxrXb*7x"T]wܤQz;A5 ŝK<Btn/St3UOc-RiYZKSGљAN irefy!#)&xvޒ1ќΧ˛M:UcR@|$/6U ]vQ*md ų)[dŻϼDTe}PRёu(J)a p9nکK)w)HA,OB0̪Ov]vUoW@-hI!W>`j`ZR*ךG=3(Z3&򨆩#'!AY}r~3|7U8ݔ\ uGt)gNe\N+I+fұa̵@@슩5.HH`<+N]r0F,hF^/l=pQWk]o3Hc=!B+M)%3*|J oBôkCaI:=Os5§Dr! 7§ҋi{[ztqjFROX $&TLU3Er^*P6)SBo/|4X2jȈO3Pv§c($3$|A8brUӞqyqgny^'bhp+@O;Uڭ`Vwƅ^W{Wno%h>[#cHp.nwOրAl#&r?|Vsɮ\_+͆O<^KsO,EjB*V[RPcH:z^ohA3PѺ`RiyxZqc7jBhZR""E-Q;pcV`| NqBym`1XmՑSeM-Ra07 `Hsg  p-Sp]oSvuye,nu<[l;ȻEhlo6~B͟w:[v}Q< "H2- kw(vS 3<. ,\.]H4E%뤏\OZն7z!pj~\alޅ]gD 'S$-!FdR❌/V Uᴑ2-=2`kJBt,bkZ1Jk) b`R9bg<k:$I}rXH\I9#8D==ARPvmߖZ޷Lݗ=jC/.VԷ?\Omuy׿}Nw{ίt~gj{_e7"9L^dn`lIr&GvRV[fv2eլ~X,/6u73C8?./l>%e{J*z]G[\W+cˢ~nKC \ "le r,Iq];PWR#0Èo^l;TexxFYmߛc񻸚^,y˅C?ȏ;qvoW}{=}5SʀM2lCMe枏jlI~)62;cڸ,_Ei?Gƛs0>=䋞~WwsWTl].`mC }x9,e!y/̦"(kE̞bfԱ>_I^k"ӗ %u '#\: !d ܰNZD` ݗ($Xb(2SHl<9'b3BR+KlAS)_e#XKh&@hBno{hYrYpr!(n55YO8kV([HuS$;t!U4c 14J~klhsCjqy]y- "5$2ڗ)@`l7@+QY#]Y%@m೰*FΐTP tdNp-Ո,āZj0;4"-65RbW`tfH 1I/Fi-*FnȌ 4KH6kjq2L32r)f3E*Z"r+"*ͦN3k0/*l5-U3.1ݗ"HR(u+ա4=dM܈ʻ׆y+To|Mi,FXqGZuO3#?:~%e xA.q',U߮WĂtNE*9p QP)6YؔKNY*4!܉NN M#Xr t?LsHX!FnSπUԘ#gӽN'Y9E`3E<;bc^2R/&*@<<$I#? 8F 2{+/@M{Nn^M`~*@P ݟ(y/5J*A;R,(CSJN2sL>d.ɳDH#INTpƢ;YĹiP&0 T8Տ"'A6|a04T %9qh iL|yH&)$ġc$!@GybVEm/las}t/e;ذM@uG2\qd)q+ SzVuT*|'1R=~ֺ7Z\Մ0s? 
y#`}=gܤƈ㻫[f~:IIbケ7QXRR.̦bH qFm2 ES(̪Z6IY`3rNKk< Y>ܫan^L A#C1i=F9&"ݟ_z| mN݇436ͺ.j}uoQ9޲G)uJ;nz1JF櫘uvY(Vx\,܅WnwL 3ZzT48CثbrM&}X n7=?qNjaP)rQE޺~UΛźQkKX휗觋 =pR*i]-4; *$?a=\N07gAo{ SYś8?[C-9Or;DܖX6&t16C ۙFΒ݁TrX^vw(S.3ab="Sm|\aoV=p[0N:Pc0`L'Fa& Ɔ&bwU./*pMAN c u((e&$CTJETj|pRC %^o11 01vۏK)AiVHK9%Z9$-kFОtLdAĝv~p4AniqGfKxhj/n yܖtNo'Tv}:76Ny |u+ۖwK{wa|FR%@玗cL9E&S: 6!dBZ[w-abFo%UQd7~l"#4h"9F@%Ai7Fs@E7!ҌK1X!Î7 g#=:Άs֌qs7):Rċ) J<ӫKp$sNCϏ^Ƹ?Gds4Y\ݻ-p/Rw\tD*zsD6ܨ4pDɻAh۪{ģPU~Nh [3LPq97jA@<:Ovz@1FN(Kf؄j,ffTĦ U@u`--)AHJJ3^\죐?ovKu39 Сr2\%N*##1JgE'ӌON18dBLkN[>6ER qRN@1VSLEgEP͇%dDGSQT%QdFסL !G~x@5/b*u+Sdww ?1]JBM1vd]o|YyZ魬;tkm[MgsϗxBzi%V-I „~]}}nNupKj6$䅋hLznsڲގ-Le߽;#D)>7Hgmփ+cm}ֹ{gN b0z^؋k{+P(?3yϟѫm|lɖ`1f(NgFT ʌQ -b V'9 dP[㊊,C˅27(gcF#Q[~ޣdsXs\Jtqz+9)m>WA fm7j-" ޙ|nmGH}`Ee!ur*2QLhSGWt%Ʀ3)-}^Gs^F>o~[["}݋K{9_5q`݈}9_9(|^$_Hb)~k|~N\Vd~s~{gߞM>"㧟ܯ~WbGWB@H.VC;bǨs$7A(zH.>0H y"%S&v DtbǨz󴔲ߵv/Xڐ.zB! ( R)=Xm+ n=k]% %7g_].W[ɚHKCqa Bp¤RAi5rQ!7EuTcAV;Gݴ{aWWP9Pqaz_))גNb2pZbVfX1;rwv!7DcSx*(j46%&'xA?xFé˛>pu+8Xf\ nm2v~b4=h fnG˃JOFIlLk!(tޅpg -g-Q)/hYqGIƿ-`96q2;z@ݪ5^6/sOX|h^ }o}JM'7$p|»+}pdϴpe̺oyr4U?=ߌA΄P8ܡ qRI Ik[ͿGA"gZQXQ:[gˆ3Zuy s!?JEnNAPnQ69ZP3_k8e=^2 RλJ0m𾛏~bMuyS/Mio<on\=1w>_~-i@JX6"<-M &?$zwWW z{>7wjQwd>ܸ=h#t?fe+0Q=n*BYsQJfWRmqR%/<yk!rʑpvmyx覒+ɇE͈ĸTPDSj0WضTzػ͒)o&qrg__ι +3DŹ?PCPDn5|quXruš 26_0z_pB4_#_^Gd\,ܕgn1@h4tX5ChlnZC#p#0#Nl)MTٳm^=_LZF0FpdUL'_@?9n)~xkAv7QRه+MĥjCPn̐y`ؘ0q%pqۏ{_aJN1dZ՜`2᪝kE,:B~jl > G>SE 9*7〺ݟ7 PcˎL\ ԥjʁvmf}w'+daM Q e$%sl8ill(Z.:b7}i1o0 }^^Dcs#4/Sj[hKu,$N` 7»r͟L;?Ggoҗ2qT&!N"zGyqX[u>H\PvMOw8?^_]a".Qc@n27+P &`\ GQ<|_,j:u}8|& @[>׬VF2֞ Shj9cLP vkGg睗͞拭!\s/Bȡ߁zOuHI#Ɍ~L~tv `DB! 
m.ZY+9:\ IMp?>ۺ[I;wnڥ8`WpܗĆ\7q_S?|wJ8~F 릳2C^O :7+vrV>[Gy_U~ c/O % ;8 ܟ=|/_^9~ҏWR#J E T딡ByċR:&+6oyFv3vndAƴٻkϦcA<\k\[[i>q_̷,F8P/&B}gяOѢj*1ӇO2ʶ ;78.;Ӹ鼨}|*nMdȲ+J^eZ*C:̭<Bd~f-k-CZRhJQ@E~T9 Ӊknuڄ7fd8' H|}uJ9h+Edž6U/ݬY{%5p{A~G=n@6gO~a uA'.C+\9íV+J-EZyZTRlٛĹFH3C>7>i\ ) !3Lyx!_㺤-S#\\oygC37nV'v6PdŲ6..``{uOxTr;yR-`"zפnܼʅpGC4+~+1([&IVx+5b-[<85u!oXvߤNn]DќXM1v i`|Fk$~0-[:v BxS FG{ڐK˨SU>4mkq:WFZo}D7+Dr2S!IG29~&Fk h\rG#6:Uy*Lo 2:oh`Pn e:e}IAˑ !}espWFy+].R}˃ٽܺA$~u1̻shʱ( ]ko@ƣ]yAgA mKfS^7_2)]M?M=` ]2?}%]IJ2Ǣt?)Ÿ]!Ct݇u-C!8*),=Q ʐY#J'SWjq=Ϗ k}lZ֎Nrt=Y26c۲=om쇧ǟ7 { ͂K$Dv,vrk\_"%( ӞHS4l3bò) jϹ) PPZ:`%N#U"cۡ%N ܯ=-hgd 0 I뻋f6X68/-쓞R_ PJIA%psG!@[)? o7IF f*~8u@ji_4ۊ|s@Гip$iFԷ?Rad |d{tT?{W붍qؗ/f",nv2f"x{}k&.ߗkٖmJ"%qMK=!yx^B JfFb AJ*%I1Qg2c::(dٕyMr7MkqIΕܺc)Z9rj-,f]sʥ.γ0)I0!x Up;:Tj) (kveKbntb'ƕozOBF(yU?nn[ rol矗hw/ud2ݙ#tР2 C7^ wa|Ce#&͟؏:hX?Aɺk7Y.% ;1o=d#V2 K&I*Ã#q[왼e/qA.GuLf?;kx ʡY 6giIb0̇8'L!zB&HrB5k+;>$8Ї2)0DDՓ܋I`8RdIj#L0/ʍN$TTcƤ`B\`0BR\)KmN&XSGx5∕,cQ55j]KL(`W{ Cr#dF4ÆVpiB~]aRf9Ð[ tZH)Vt %tgڬ=?_߯uj/"u< \S_k8 HN?oѷoѵ7 ZGzpu/{xz}bc?1GnOFG~x^ǦU+B.G1'HB:rMՙH @'\p n4@XxXH)|h`hK;K9@XD6'1FgAaz6;C@ t珬}u1&Fq^kuG[ԠZ!Nj0c0Ӥ 3>߫έQmwl A$qc: N?UFL'CMhsovp:I^tZRŎ65j?ҾJɚw @'ywXot]Z*ߵ6a@5txȑB 3:(43,WY0D 2%j^yvYyvu82MՌ5-RJ\@S1;I =fY5oL =e6 DL fdLݵ K lXeeB M/&!^9a]wyv58OYtKICpXC)-灡ҷu"Ve9a[`p`rmRbl&5-2xѷRKKq6]WQ?6^m4)3_+G3_{Kܤ\fyfM)]5{u/o۷G[Awz7s\e\ 8%h3d b|SLg_WnΛo?oO+}K=WDeѢ,|*J(M>nN}ZNwԲnR֭[hu-ֺ^9C nRP̸pDdCSl?~X6)K{=پy6`{ *eOԣ;g!Œrad}sK=w5}O_[,fw?5M-q'SK#PҨxLJpu3J: J읒F學(ꐋWH* aNRI=_ZSRÎ3J -ZFoR"vKvN}6+:Pj"SFb %Z 0S(F|drg%,pRj[Q5RHq4E,"|-NxN^g8iU1މxwc\=Ve&t? Hf}z v/dB7V"T6̳OvƳ/j%20ș]vfm9k%d99 Z`' oxo |H1-$$QB,.Zwb>s?ͪf$*s@sX)0& 28GW*2J­g8 g6; FN(;Jʃ8U H@Bzը쎀FAWSE_;[8*=|";"P]訌H@ +^CO,j`?ճ'_)u4I [{ϗBcX%\赴QN`&qRt!m 'y8q%=mG'D@ĻQ8?Nb A7G{@y.>.BT"rA1 տ|pZɐd ݞ@ V)h|G}KیfZ=]hЈ9Mcu%A&G+g&u@ >N렂"8t?p؜82.jR.@ I-:!KZ E8ͻ\g]-c${aUDe)&:f@[12cs8ڷ#io524ҐK.!mR~}=JNpmw>,7H @DTRT;wа]t%/¹BW`fL!D(p8"lW[&'zt [c) ip mz1. 
`|sL-m`&S&A ui\ۅe-,`doGq8t>AEB,h^%`pP_IC^́b#I^v3> Qv8vTj5r*/$]86 8E18Mt}D 0-Gl>m <[!Aq)'|&`Hw5LT3LӏJԠc 礁Jڦ9@4כ_n9~MEbϦ=8wV}U6JFW)誜mx  K kHmHI& KYtlpGad@޿l~**ܞ7 0(GIK^݁{ݴN\rPbc( 塯=xt|$ 06J8x%Rzmݫʫ*J*k`@@X$>"ʼnDĨ2zbb2,8/cNo BV݁j/I˔_v#\/jnԫu0i>wcd@pR{ڄl&bWIdXZݯ3y]Pz?77ԟ<ƈ %9-k|7}s !),O~R8Cw[ԝ_wRFsm,'9{2 U,g 3\!($񀕤VZ5ȢTQ&I8ixpcaT`N^#X<++LL 2~_=w!_|qg3=n 8)?`Ǝ'Ӂ|Lf\o'KG_꓏Q.)tΥu58X|%U&܍fÛp%J fz&@]wZ.؁ b`Ac\GPF9?bI~^ >`|UN`)­΋(L'aOKW <' Ay՟]1 3䰸%^ݞej?Ql]< ?ڒ>^fm Т?v?͑7r]{^-o8ieh9kv?Ȃy]\u  g?31Zݐofa2Lz\% d3׼9}?-]ȿaon5wdniȖL|:a^e:f<ƾUkq%)uX0>K,ɗ2W(gkȔbݙv>gEjPԓ*:Dwu:H-w>>EQ۔1x߃oo[sSR,ؐQ^ "e6q9O2' J1ܪsdࢰSH c'$>d)1gw[ĦmiM)?'e1 MNJ穷{BS0G_~䌑rkdu 3B ~2js}Z5TY7qblH;: EBpBP ^Ze[^,H.z5]Ai[^l.`쩗I6Oisll]=X]gA_U_Яl OË߀DM~vn\DKH+^vVVS% ̲ٝ8R+ym60yBtߵ휕j F)䬚qcd1`é$j())B5aXΘg)T@?ǨUV6_k '˕JQW"CcDȨd VР=gFE{) + x"<ck*DQ-ex-bĢc$sc:D\QccWIQ,;r$Y& =pIcHLo% r^K j$bNJlM $ 5 mϵA;tB)2AIK)-5B.;Ok,9mw%G^rPҪy7(2C\v+jo Y oːmj@k` oY\]]AUErGmJ˒+$D- g”E7uN_OsP? Gml6z<[&,&DwIk\[ٷCP$:˧F6.=w.qdgfDmtڿ[|>3m;1F%8_a}FS 0mH ֺ8AfQ~pp8}42TFTFTRF=Ze9@W3+~FW 7rst⊌ aK'\z4Xq/k;qFeN==q͂lۉkmրNPM ՚I./ \i)R6(3's .-8ޣ~+ZAr!4s:ZOJzǨ&Q8 q=»:qJ{PE5)rO*Ȅ&39 I`yES@6D/h Jj `4]=^n̪@\%pU֗ LTqhn@~}@#d2K?C`Y4T5/D!0^Rǘ"i6[j%WR#jxCu>[x7Oia$V`4\ єN*⬀)" <`喓݅Xs~[`DR$M噹ro :6*@aF -/SRP;LKvd.(\l{PX^'A| WN{ >D0:Vj(4{h|E Υ!)5z: K"ȇmEI߭CTauCh{sw8IP[?ҁB\Υ[=1U/]|wIN?' 
4Y禋h(] Gf/'dW K&b&r%^RXxY+yl' $RLwyüaOטMog5DiE%`ϖ+Ubc|tm\j[&Q)?\ղV+±-H́P;eE X @A.8lL~g>d'M{1lҹD츷L ءz[Ϧt٠"JGwNWh{s7!rE+VKݩVnTn!;1O%GuT&D%RN٪ݴ'"㲗\pzр*L&ɭKO1dՙ:ٍ=d}俯rs@1Y g7.\_b -EYhm_5P4#80o% c@A@YɈc ˜$ bͱ0*PRH{ TG2]LdžGJцl,LmEFeΘVr:yZF-fR: SbM= cj{_-V2ߊ/̇pm _ lrV$yb[Ԓ%G$rwbHSFg5)jRhjB#ADW4 -(j~!NAIKOӆeNWE,N ny\]F&"gk QC;G@ō(#dV%K`Ɨ>N/ÇFgcf< OIČ/0}㱇b d)vp>z &8c x$$T"<(C" DLIly;7zݏ?.ӏQXhgW~C T钦G -c P4 BѣG燢KRr ;w/esGS1a8$AJGrkqZqG!dbAyRF?~[t)l9ލ՛c{f3?^.o3>fOh"{5Cڗ=&;"GH "Dr3.n\8~k$$L}vHnG~ U@lvM|.y\Ѣiϯw3?ֹOU ( _Ύ3t^3\bz*_IMq&kr,y;s 6֔̍yC:.Hh~/ T6&~u)C/qA瓯ּG/|HDY ˏd$xO5x+ԄL.FQ`}e-=ٹZ6TRte<veI EJ<U)d)6~HqgQRT0^v!2J!gR m϶˒zT ybH!JuRi~e-Aȯsx6 oɨe'/W>jV2ƯLC4>DSP6bz7֖rNg\D5낦edfr&@SQ *X?x~(ߒ]GKAaEoG:XL' EKÆ6TiMPjNeZ$1;P iJ|u*Nu)YMM0XVMV"cxK.~%8?y]NqRZ5 m^_Vk}j~ȧv6Z >OoA*ZpJnqxyMY*NiFAv)%+[D&UN"LN22%RS21GsQ2S9Y%ur܀j'Q5 `Zz>Ռv+AH0Oj"'ˍIOj٤ǿ$( a^BqD灙h19Jy*. AFikd9 ?=z17ecĠLqhѥSmL?~GR6ђSz`'٧6v'U|NyAضG"Qb6ed?]o* Lp-I74Q23nׂv!S Qug}]. O线DsrO7UBM$ZJOJ)thk$OV>-S0oHb,1:[_w'znq2t][ZO"S.D.^]׭Nrꕕ#ec #(TK g֥t @3 hec9kұJ)k,ΩI뮪hza`ًRh͉lJ~cWIby.~r_?[ThcWL túak[>*Ql?(gWW!P]Nonl]cˀelp>c .PՂdܪLh/زe9&L<:>A+hIω,:VVm4e$fꂡxhqN)&~]TZ,Ռ3:83L1=6"؉r -'%hZMNJ*H;2X(KIl_ AWG OkBS Vi*s]aȹn˭xl.z8J@1Ǵظ̷sX7}y@|3 ”tCoN ]Ju{76Fjհy|Bn[-ԑep HLR0{+f[yK& .Q01)gQF" j#Sî^/ro[QWa`ƹ)]}rn*t|3 '>nv)*;6-y鰷m~.f,v#66GvѮ '㻀qc]}%rSʦʞfwE9w:j36"[?"?zvdD&'k" M!yNr' G%PӚ8ʕ>sH\ky JJX0&TLwQCp!=OL%Q Y 1JvR dmZm s\% WZ 05gS qh^S+2y@j) {ќX1dMr,Np: 9PjHAqDգ*JƉ!%z(zH"Rz u9Y ӵwEiz_ZhoqLvxwL +p"ߑz/+4=Dn1a_la, &RLd:F4DǍ?,qBX|+w3w%q7ٞlĢ7\UaQ̶%E\*tu3 #|t'7,/q]1wt,= 6)-po, 3gfht>ǯpX7^ike?)O?-|˛y ~ !,gqx>E6/jqC5HS1c^R2,̩A+F;4Jsl NRBwIhAP'5)In,'Bfȹu-TPTlԍ]aԦTP_Vq@D0)Z!q2c>NĎېIg\ ˅ʣa3&ѐ/JNU:8˭7Y©,Ce^X(Tu*CerA #<%@xY)RsZSR_J@]֭JњIezqxx3o/.ǷF&, TwAQ \rsNQ?mc+{=b?qtq5Y~XHavu6 'ͣoR]). 
)t}T,W |\$NEt5˖R鹯A0V!"^\XUl3b_b`dM.~Ҷ]@%5tL 0Ad6,q J{kA){uqYL MkzepT%[ImM99}]\ILl8 $=wAXnC&cfF΍%mpJXYx6^3r4eƉryolF }^.ipy RqSoGt \nWS;:7EAum6?;S$)PL-+A RI-:CErxl9h@eO\ p揲* `Be6i[IW{[7KmEaJf@ L  "g֋:90qǽ0&57Q#(xC)@R͎ipEh^HID0C'F  Pg" roC#S7 4u=P%fg˷$i?>F:t?s=OE1uL4Ac=ie 󭌀%wc!6=-|qyY#4'6]ȣyWD#p;oz !%TT5'5nrq^ 4 7%co#ukY#+ν7Q-T{+AZɞg*ׂwxK8ٝrMwʴjӝ2çS^Tw툟kPY -w'7]wJjR l'<\( pcZoThU޼n/gJX Lj Uޗ||y6 4n:+ ۽ЪCJ۞v2ꐳחPj>h|][jF({wGۿ2lZV_jmT)Cd3@jCh)c"afbƩQ<[-W'l%ǹUk¯1@x}G*ADkBrmdS ~~0[SNwTngS{n%[hR >|()]ݟѻ5A4}Gv]bZԵwkInmXWn6%mJN6Z<:_>Z_/hNgF빟uQ;5wo>l]m^}6>+ \~kn?hL`[<`rⅅӓE3HEK$ D/Q$ۓ@$̒OɅS:=<Z,yd|̂샙'avꥬ]OQ/m[/eookB/)KbTLD*ފ5x]ɐCT7ZǺoxTJD=#j;qii+zC+jZ5NSQW6Z6Kv^W35+EM9`rdv\um%g5bY1*~ }5jr`dD)]1hߨz/X7gïC|-geA55|T]Sw˪y?@>dUFRk$ )R*j*$'@1cU9F |YQ;}st W$">_Dzz'|E;: 9F$0ErC#(qA)4ؔ)梅v @Ú%oDzՠA]{jdn{YelN612/%@Vq AljtԵ֠ȃA !O\:01#˨M&|QVը3 %[_Qs-ـSsT!p*LɆyYHJ|Ȯ6ť%ՒV*R;4L ~QLo1EH4*Jkeq㵬N5}dF;.,oq ɒӃXG)#j~Ў;>6(^\06~p)5%&a}bcY4s_d h?UȨU<s[ {jOAv؈챖F-1d';jb70._4sl7 bq,Kذ;(CqRRmJc\I% j{+IpSŞT|52ʳ2 ?j !_F֏-mϵt3귫^Ek>- VI;3S$QOY,ԭT !2G6x= VjO[HX:g'?ư-e-Fֻ?@_`dD;PO'x3mD?#D.SN(.2QbY L1SxQQG}$x <҇OcpQ"m&g9Ds!e%g[E B=#L'^+0Z;Dv"7[xRL!'aI<+ }]%ft'>0S# 53LO7y B>;(7&NS\vn$ Ltq` l=#ceg_2 "&񠌘2SSk7xekM3ڨ5=]捆j:cؒ9=c|{>/{gZ4s>\@5FCC@q5Eo7>L۶,z1E7~2S%NQtJ~ił!֮"adx"sV@Eu.NAMvuAa8P\0.j+(X+{SCUmTUumY%*.7WuU=_1 ^ q|T짠FqKԩ JHJHKɘB>xo>Z!PRkjPV-0U_e6ٖPPLVR| `{Vm:U/\ei!Ś@e;fY&%6Sqj۶gpPd>["SAdcZIB1hـɩ ew[fI JEIifJ+ *TkU䢒:Aؔ Ĝn:v+ZҎUP"4' E'RVD)F9 el;kh}Bڬ؂U1`A̚A+AdE,;J&J+d#4`bW<'?#qyYBW+9Ȫ*NBn <s%&/ )^"yJ -B(!ͭr(|Ig>9 / N>9^?4ǭ5]m'hgװN/\^TVMIɰ4;X[|잹IGd(3%) Q9d[;77??s2kޮ,[C|ĭƯyk­}ϺܰzBxc+#%5 \+Znޅ&i^*Pްڬ]_tbv3'.O޳0h^}1w:G3d>)x:%X)+ЇVKN[۔XxTE>@ l B,AfRFsF%v> K)6儣z)z܋}w?,-[6ݞxO7poV:R7I8z@W: Љ=gõ UK-bfhxUaͽGJQOm w" |E4*C"q2,{shYsl< dsdSS"嬍dN|]ޟ%NAp5˛mO~Rug%{~אЫ?]Q㊦@ds^,..x~3g̐`;7z:)XROa͍Q{f{޾"fŃN4(zFNFtr=mlޫDHn󉕔6;k冉k2xC-r_ڑZd5^峉5GǙYF|__T7;->vFϘƶ5 #lۗZSHFIe1WDaYB:ΉcuՅlYwͱP ,pH5Q^lեHU`Y>h=@aAO7?Oo-]ߵ$j/Gb2#e[ALFG?~k燏"n|׶:W߾1|g%~wo~ѽ,,U &|[ޯ#I8 7>a<1lR"&띌wy~q!c5O%p"cZ&1-J#{{>:WZ; "/ /# SP]E-_+L S}d좃 
8&NEZn3?V[+^ח˻]SVg]ԧ<)ْyʹ5/~M'gS ;sѝ_OҖgK-}?A7N~n5<_OтV`~pf/d28cN1g`Fv;8:w1~X ~≩uC$.b$X$ ; :Oma͉(-ع׶jQ  FǠ=f{FxYng'^BHʊ$ ab xTW'a⽄3Y؂~p ,Ayk$zL|ʈj3D" a2Db%O3 ķ"<3%h'U5:hBl&guwQ"@Ni$?8cI6uV'HڃQ3ost2nC6`-G15mIV.`UOdhn ރ"Ne#Ĺ1!iX0=e{:FXG| m;R)@MWJ]g73hfCͨ߮vZ>-5 D`BKjK-^%j)pc<UtU >GQ $JN*g-sk dIPG&ܲDRH %:Zl.FZV[(dmEk底lRbL:fk$VpmI, 11jfIk=j햽oDAs>Œjq*Te8\]L\JTBM {! 0~jRPS%Akk]ա &`4R$v*BSPAB/u4:{+@ 1{Z Ɣ kKAXSJbSR zێ=;bT]Ÿ5\YD2}˄(l>{&:1}VwZk`:QRDAy4L@Q[=B(1jZGqZ;U#ɿ"KpH!w v~JdI'R^'W=E= $q,g.*K9*n~ sB"KҎPĕArA"A.JaPe B9 W?+PUSJ#‚{귫P?y^'EҐk-L|^~{h!m Vd )chzmqb㤎 mI6UamT`ur Ʈ@1 M*@dH %"2ߦ >[2c"s &#R6pl\@ad7M A|+DnGPtqKj6!+64fTgTOɧ[<@-:)FF'o@"ٍw5$bKQ6NX(~n.v:)stCJsP]ZI%OR+R,Q,#-|ZH$g r>xY|Al ,_\ۿa+s +˂QB)_6}M8@{N~9aHrmӤ=6.՛O/wיUn_1f~t]-=d Y?N8'w5XGsc.'lJ}_a}5$x 0mX"t(Sٓ?N ǽ)8t!52;֦I.̆b! eXP6i)ɵmP cݡ6udl"0t;d:<џ苎eBP 9j J ʨH.F' w ԜIQL!ܲф%=w7Z]!G&# XlD(s. /1Cm1a0t%$L!;!L!V d,Si@hI=*Ԁg\iڸ{U?%3],$b}z]fr2[m䌰/*pXT}[+yڼmjȰ(i* 4VsϢpM,6 YwQ,I<̨fC:3i>E=aG2as_XG[Y=]+Xp̄[?\^9lc4Gb M)D!olƗBw25]s={zY8]mF;G½k E^ !{ڽz)Wh7@"Gi0tKİ^ S n'I}běYOv^x)E1r"ϴQY%PxyBRTa\pWKag l%(t>OyD?i_]wE0xuZɷ@ z[LFa)$Ң 1:{6ߛE7N+gC?欄fgI2w{0_#n~l!u>v\?ȇ V6 KR.$Cx:у5f j\Vv㦥_3q<`B3sA"SQZ N%,:`N:oB\` r_˭eYϷ`څms~ShDր:/u }[MbTq~_~InuǛoM}$؈52x]Ve71ۿA(qj>AA1R"FfM_S$?`[3F$T1h7^>TrIP'Uǁn 봩ݓ#O7tJ}.4S 57N9iAɔPG萋JDp$!BY;J|ǁY bJ) S( I(ٔ!hm YR9k I+Ot6cR'G)]7hFs@EWawR^d3Ѣ&r*mtK0 .$D,:xYGg N t]U*>Ӌ-t}ޙ:AIғsAZPfzIQ'ö4TL/1R\1h>K/ӫȻ,W|Jg F^ 䏎0>۫_tfY[1~}U[`_>$w9۾Q Ua_Qt=Qsiđ䧝Tgh-GY9,챇v{w5'0d1Ȥ)U=](>s5Pi-ܦm}'-d@Es?h~W^i*:/z`+; {D4Mf?H0/`IB)^> ǎ5$q_`  Zkw wwh78"3LwA˽O !3LF=梏JoGHrswՍXë4:ύx>ܢۮcNawʰxz\\H[pn]>}jݩ{umywE EHˌ2y{ ,e\̈-}?ެm}o;Gq*fLygo4s;.$SxZ3W~$6֣R!!߸Nj7xniP%:K.tK<5zmpnvABqY$7MmI98{ωLJxn/ًɈ1%\\bCx4WUwl*JS V)|J)-d78DѮ?&KhoBXO~|GziPʻ;x<DБuQ\ A(DŽ@'rLRȑH^ "?g2+'84\M,썙Z1)}@+30LIi 7DNc7Z!ÜΤm7l2 Ƅȫ 77ܧhTL0Yi8rG c /T## Oo67orc Opϥ<|@l9%8L.At#O3]瑯y#*=?vv r&7@}~B$VLoe%Ԓ _ya3 I>T HeAAlqkްxB(62.#oo\6:tL\ $6L=|6 X<`VX"ZoWւSEG+Rn 08bj=F7vkځAzT1ځ( 3*]B|M$&4dȬ?j"bJm-dqoI\E_+k7^F~f>zVZrk|`T QF{p- W05de]߃5opV-+j#"s{ -u=![Hjs0X=G٤]^Eubw:T$$ʲ`$/ 
hZDL.@{=jTة|wC J(E  O%xibB!i|3AqAZhw(CevZ1dd2,h$\>KK!&%SY@i-S\@kNo;Jh Z@B)f$)1-sfL"/c)k G$9!zޕ2>hCnR6T͉ƌC$5rթF5*J̳l#nrE|%Otvs6\zV>w^rPd&wW$N\4wv8+Q\0v$@Lxo2Fkް#~(:cSLèCj\9~Fo!mz\o'aO[W@n}tWbvvy.椰nOS89}.YpDwږFcXsω$j6x]s ְ5l.@ c\'5Eu:w|?ĉ5,.9|`8)=P QR rF\Ùƴh45hY'0CpC3\ۮG/SXZByCf]Y]v/׾D\^~x]vR9lwnj w)7x| 2Kwm)Ⱥ2Rﮚ7 n봷߷?|ßZ ʻa;~3ÕPw{NЇ ??wzz #cKwxU  V#$M3 l 4w5@aaE_&FDt!:I㷰Cڽ~lt9'#!3?#L>c߅;]áحIq2w۵Kꌆ+v\be4"sBy:Rқp25;Cl6ѓS;cQ-` ()H+&BM1+KRƩsn ľzYbv~kWZQU-js TԥvPIPTV6hF;Dl\ͥ͡Pv [6\Ԙk?$:%ږ ǡav[}{IKil))e9Զ$- K]嗗aVh2/r?|’}|'ȗ/:CXA+$x>NJ2v 7_qp_m:2Ȅy&`̂ŨgJ? B︇^Ά78K:udLV8X>{;Gá8X1 D,8ҙ,,ќp+7cs"e% H6)°8MvQ-@:9Wof48SDTܨ Q;엳 }78 =Lk>'#E'ms Ў3Ol++#WF:s~5jJ̰(v6g.`ZňIWh ) ^תAP 3yF34E(gp URɘr.AYËSFd<; ƛK BĉA<0͖wx^DŽ n/% uPUrn:2^dl.ֲčܵ:cRN@;=L`>ە#mݞ I#]iLMö)TIuVyeJ AJqCleQaQ : 8:AT༹*M犭*]SsQFRֵߖ2Cj 3&/O35*Dz7 #gfuU,D)k;d)5Gc,DTKCG3BY-L(w} F1våw8N:>`zr(KWl49Z1[49@m,9JɆ1@7 -%ȡp'h)&@8iPjV$Ԕ(DVSQ|P85e:RRS"qbBN#QSr\jJ~?MPbcA&gLZy̥M$JB /iO4]mT3.9evsJ+G&hS63٧pۿt@kY0=%V^OtX-l(;cfQʛg/2QUeV5? mʂ&W8c;S3=dN^3K'+Fv#x>ȯK>'qyY65 :U ү^||hle~Vqj{J ,h3Rsj|,-> FmGVa:Zf)@yaHWɻ2[5P\ܶEXmߢca>əl*Uux"gFTfJYE55H4h?(rl6)*)]ST޼#3ryy׹>K[wp@׷~Xh?|lFMgU^5=T{SW9?|?]7"qw^mD<>|hqqka?_k+{go֟:Z_TۿթM>>ז5YqW񡿑ߍͼC6zmJj̇w_ ϙ9hp ЏBgym,NJbn;杰?72呂"v+tA3IB s kr3R/? 
n^S&[7' zX1!0AJafKH%}Ȫ%wӻWp(/a+F"^B lʼncu?~= -H!UP}n4*e& bΣ60j4 ΀"JM70XAkss ozܘCx(()ɺmt"sݏy0;LG4v'l:4̡d?Vl^vzfD³(ӏ5lQ"ː`UPt$~Mйqʬd3yx'T[Yǚ dut#-+`}oPy},}/d)\-ycrΥ*XTX;l]sV* 7`NtEH+<4yQ!₥*,AMbMʼA]՜7(]Q GLF-oS]%`%m1S%ZFvx;m7jqXwcQNx "H`w-F#r|>X[A`G% yORZ8pzSK<΃C'X8ůWp 쿢&)A݌+mB}흲z6]]hWsLNmHL> vA b$!A!jQT$N~HYtЎ]VSiNh1r)[oDZRx6BP n=b@&]4!E>uYԹ8svc k7HN"n*}rxDo;&DDf%X BuڝGX;tL+JqcpFsGn^}FGN3s>4sr>bn(`PzRp{([JJ[eUi8#gt&>PtqovVroN"4DU7[slX KBTζq_X|řƂ`&Ċs>Υ_z H_w5DjdrbFb圆yW"g:a>bSZh\XWJ[2ym+іk[sXiVRQiʥP7g)+Q JpLS6 ƉSPպFQ}|W-y#<Żhp 񼋍{BsHA@/xtDY]ezw>cPE #=(Q$W)!w+|i[ .\tW^Mo$_80x>ɻ3Kpq%'@JkN4*cp|Kij/mBRa_LҮ<]k3Eghr矰E#/O ]wá십1jLT?a_L(t]x`_L\ZE N$ҹt'2ȬTڥOd>QX)c'IY;1-J''eMˏfcڥRDZT2X!ʁmYD77YY߯äwks<ģt,=L:Ѽuukǩ#;#keI˜XHqSgkMŚ.~[ɯo$wS[?NqۡTW|NVU܂3!Ek#}mʑ z*>t " Rl@`1k̜EFu)4e)\˩ƞ|d1 Na>H*v)| vSQkXTpJ{ Oi7*SNϸQ鐾-'?Fnf;ot( }.{׏c/|bMziŇ|~e08;/x+oe9 8`֟vH5/Utw?yT54ۗ3Lp0+ j`]GRX-I3ŠdFuɨN&vU>34Kn,l@jBdA;~/)3A'K~xXx/̉|1r.9aL3cGFl)?'mܱTU8uہ%MuV@4.㍦29 K:r6v>Oמ% MYN "k&X:œ0צ|)9#S?7΢dvD?0 &o}J*]ڦAY` ]J_f;ȁBC1\Pf}lVF1k#4;G݃o!:mhqJ_/Bu/x6Ayb8DH=E%J"m%);u;u."(Ϲ*43LH*DFEF ޭq@2X9b*SX;(13x DpZ-<Ę!IQAڕ|k̋.O䀽vy d]Ԍ ɫsFvVBJnX[lŷV:_%؜s?)`J)x05hTn1jcݖVV?/OXiگ ;,N Sce ne fFJN$)r)ͯFܵKmrXMw&ߏNSjK J6⮺&2Vug+|f,lcBp.>rA lݘyxrIbckY/+dx*Yp1?AR7yqs'b:e "N]H_X,yl3IN4":`(J1,c9aN 7 ,׆7kgr.>v fmk;-3wٽYnlaL}j$Ct l`R@˱{sN%<-N9ݗ$o| & KY xrEHMRzo>έa2&g?w/anzrD of*W PTM#S)++U} 2҈ EZO%P}|Nqu5dCQFA[*Ɣ.EGЭW|ii߫l}G7{FIG:_,[䝷;owe,;չHL8Fc8F5Pi%r T|P8K.F [>| Lw˕~yo&YC#ekݒƂqR0bڹ<7HR1*^Ce i3.S!֎q0BH>9w}AGCk>C"G. 
scD `RMM4@cR92F)CT!lni:L`1cD.Ecqk4$5j)>>*Mk!&A-gw8KcesW`.5i#+{ N&*Ca U]A 5Կe6Q/E,\jxZ8 ?"KG!(5ε=;uϨɺgj9zSQN*WސZt|VqՔ#N䠦1Voo{s*ѠctQ<o}àZ~wz 7 39(G3t=?"q:E1;ǃ3nqR8 'ıSN ;l0/6)6ZFc3.Q*ns8PN%Ndx[@x )Ftblʵ@fVlPhjCtD :X"6mj|aM#mfU v~ŶbJv1ʩR+!PDޯA2"/G @}}|*wcmh,a?٩Yj֬$ϪmZch^ZřlQ~y#o?80byyUJRNʪ YY8?W0%5ʽiOlLF"No>jҧf~|f㿞b0_Fߢ&[onoqjf.1^Sx(WQk}`6v eN,JT]*""U6=Rz6Dοƈ-Ӻ[\Gb;[V-c#qnsryBR>}On8ת&DU zxqNt"- KO YX-IG^[> ~> Sh+SC ⦈xZmPrMΓ|m+ޙ[ h*ӂ*P|5ư᪽f2Ӆ4,4X@BkIM.B4,qdRgE('\BpJZ7@^Thdłv VdMot R-Vce4uSa\.9aN#G38CQm&t)b]<궛!FB&- Qv4l-_v؁cඛr\x 8Tඛ(Ti`b=-YMK'7h{|g` ,۱?M 8K7N Fət_bJo^Ŕg|<S{y`gQBJkVͮQt XL^{57P(](eO5e;hK')dmy$okx?\=;bTóZYz]5?7hpz@A c^6CmjGs}ښAAo;3h ^y4˄cD0F-GhD$f,S #;BCD fe$7Rup^m aхu-0۴v=QRM:e7PCSQ*t[0R)]aw &㥅eTKYkAS4Z?A kAilzQ67V4KηB1ܗJH}ø[6B~Sgf0Gq~pˆGXA>tޮ?}m-h]{Gysa*/#R Oovu[C?2}v "!iXxu*5؈nYWg,#8aȚ!+qH "E\*TP2E◠Md XtKkU)E&B1qDNss{sPEȝ z.gh-O_r3)KR1 q'K? uj9M$f7`uf@ت z63`6ڟ h(fIN@C0кĶĶӽu$5eTr5g4>;}?FۺXQ»HFua26!!cDn뎹fnghL]rJ㻔šPB&7 \+^XJ@%))o9I<)s$0xb V Yg74=&}d TcAeg F`x."YW pFM*ɍ4L|~VrybK/c:~UwwK3 CĞ"_, r=y"Zw^k^w zίW#Gft¡?7c"~\t|Vђ Gұ1Z%h-] 6"#B4nf$n蝷Nx::޹<&L2W>4jnKݻxx?ewG`NQ*- U8 !tX5[mRNncRW g Y ,1H~?/aiq_ufݵב2_|0E(-7*(,]!@aÔq\u%xtSP' ZHp4z^`9[Lҩ+3ثO1Kʻ}l4RYư!)~pvY?~7pJr݅Uq[#r&r5J)<8Ln͞#u{[@%ؗzڣM G$Z4I4KRuQg_.'t"~閖&@@[Fdz}΍~쟬fUh?iAqCI1^ ۊ$T\.>{odvvͥyى/v9%$ꨮu5,9n.K ]>:MUU Qj9@0ֺs9H9 HiX#:Al0B;^wMR:( /E*zD0Dw4jAR.*cmLyE$P ^yX1^BH!!6'@G&raljm&JeXD6"]QorNpb|`$hlGgo?%Kojy`qTX$m;"M0O$;63TeRF LP`Qb )rPF!gV,3̂LL&V˩vVo$TwQ[!ŠTt+[4|934*`$y„BiZ Rƙb;B PQ,-+{'PUY G*I`;(,uꍭ*/8"\q1sӁkfԤ6ԅ5H< D9EUjnā)6/Џ.ɖƯb9(o-HթolfT=k\eɷw&ejTS%m:u򺸿Yqxuh%InM:]<~I:-E'ܭছ}?q7Jykx1?*Dy{N+Sͦ4ͥ8ķ85"Tc\Ts}J_u4M)6c% VADo^h haoQv=p! E9>l/S,dm=d.>mqR$3l /Ɯ)/G+u# TOf[Rtn5΄Hc3b WJ!۱1Ϫϗ yLg {QT+w }ܾr\y{P<]GW!hOaWݻl骇:z`,z=`Xu"J',7

    H pV`DHu^Bt#}@_ C!B5Ӻ9)OE=gt/F+3c073%Su.FLq2RY Zfj-q"Ms"YH99j: C+QGPE *He$, .eHL&?aQŒ@RKv' TqGM)OI B#ȣ֦RMN9DY"F#$Gb1ro3Hp3W D$)-upr:8dbR9@霤 ~7gR"Fa('ELa1ǻ @)8U\@RGcLC#ps?c7WNĸ lՋV 05 ֬߭سo|5d8iGobQGՄوßVưkge]'厶dBE[#(/cz?>Qg2mmfp.dS ,c9)\`#}rkL/kz-޲hf>M拙pJ/& r?'QDu]YG"`] :7o-g &$JtP3w[ޮu mXu [V]>uhc=#>V(F[!k$[߳7]>ݻR#J(>׻?#}.-z TE i-A{ :G5{ HEV &y+CWqPtƍDwpEAUeuk5[$cy9?lZ!K Tln|__j:j!r㦍 4,[ ~y?Q&;Rɞ@ 261?v7>#f/S}B?/P4(fϯS[7͇|^Ǐw̋r!|1unں#)c=%[y䰴Nd%U./ZDTa+wcD%~Fivԉk<4лa!߹6)#]3zNoxzNK]{=[MD\$ˣy7&2z<G(25%]<~Mg>v[9:]}:LD g~,2y+-'O 33BN/iScs j:i_Caɷ#O' .3߲\zuqG7><./wi4wR(yL't_FOFd'׹w8{KJ=𳷬n\ m)~{pBvL5%4: +#c7qJ50M数trN37}f[ p`j2rx5sEdZ! j0$ٛ ]GhRFĂvy7|{p0^0"t=Ͽ?YT2_glyV[#{ߐEQǓ!v7g0s(4J%7C)5qPj"J moT y*Q(1D f!uԀ"sZ\$R!o E&Mה)6Ha_l~"nfqʱ΅2;/_@W6jRuHY\U j{6Q &J?=*)=S"1 هJ=3,{p&I1`&9kc7_E^?@-wK΅@Bmw΁:z \t]t#%: #Qqٛt!=n,LVn@C̪Z떹CƵUH*r$H&3IKD$S4 Pj(Amyaw)5P!%.ÝǚQJm],ERmŕ mNi GfJ= =4l&baԠGe[)0+PmaX)0+RKD\X⃰RR(_ !-Fj'~ؑZ^M.J R\5bBꇪԾJ/J RBB; 2"cͥϑD%S9δQ$E-q,-3n`J(rGC U;dc g>deXd~>JV}#[PxX5%]ުjG6 /R oUՆTUvUm4JvϺٷS0M>30K/0xwor  2!ݡݻNFۄH2d}[ONgk u{Pk'zW} c9$=jc]qaD*QR7Fiqw?{ܸJ/6)ɸ_\凔T&r$hk#KZ]f2:}J(@ iI*$4j'* o 3A@UmmO^RT"@c1b\Li"1ǩ)BX` j_Nb7m>}vTlStF ?iw|:sȑ#CvÇU]2'ӯIhYSwRKdzU@P@~epG敖AA\+~Zسc=|v)+ P ɑ7:,sK@ P|:q ґR~ݣ3`@ XW%ƭaϕ>²UEFxQ%+[(G{Ӧ[Oשoȣ3]]:6𵧍qOonaě,z7W8mz|ԛFv>߿ L-]R> &,߼p)i>|$K$͒~U3f&KNPZjjEc)u6}5dMXGvb=C_#"+Bo:拏Vk[֦'Zͪ4h*)j6IpZ;򇞏6:2+?-1{{ EUP=:s `̻@1^3IH .ļnM,1BOi4,XʗGNpƻвGNdi'mpy@=.x'r@wy@ .ϻǬC~IJsq( 5m.H!' 
"5A( a ޹NPSgܹ#A mkz}n^!x"(xf<jf7~r,Hfs( ?Yh @Hhu!Z#jqV8кPIp .D4.1nV,U8(m6UB'(i6UU.21(&iE;C[n ]|ʅT~m ]Db1Db( "1[9(!,@:T*,4 vP9 a>k4Q GBLs1И s@ 9y9S}qjo dc~5z&{{cUSUӏQv 4ӽd^5l^mdHBFXe ZW,<8ϩ?W^:/>lJ͏02R"!X-?ojlNJcf.M7m5l34A{t%oj?~tmE)BY;֞kjHZAaRK(;ҍWMAL,͖6iۗ2Qr+V*b2yqyM?,gc뫹NΏy׾\JV&q} VS'5TRO^?͖h~%xJcAUdCl bt<ȵ?ݥ{ 6d@7wkL=Бo x\9qtA|㥩?`B[]P֔-%!0WnQunF73hPHnF*m}r ܺ@,w3 y q9(au}&_ ZX#, Lb H"!ub|XH$i!`#7Ǚ|˩}=sRY^_eG רƒ3Q8MTbAb!"PFQQ S2T'\2D$Lb`BZth)\65e, Ddz,Z`H(aƀCEX,"ܷ"jEHٛ8R8 0;y aT)WBTCa(8KPFDH4NO(ע<plU@39=xiJ˾O4'fH蚎@MB"Lۓh%5qMGp|ʖ$`=<pN!6a[&ѓL3'X Q8)A;"ocߛY⦿~ .F9 TŶu.oBV % ӊa<یt/w!PF (.e@j#2^_^9Z)J u Ҭq[ƶ5S R([s[ -{0Nό.v[rRL7q\Stt<\l7S{G;uT-"Z>%=T^ AU+e86tB5SphK(i EB(wzEӧ_z|Z솥7Up-J~rJ` dXoӵrE+՚MB{sQ,}Ꟗ% k0MWF92oSDf;BvU UzSZWPYvim z#%uk$\xr)5/!.z;W~+Q ܢ=dsa;&xI-s_ -\-քP|F'9؜gtn59]l-bt5NKnLxS:)|d?Zrۡkx2ՋGZ,);{CXٿ`mx>2_;A>B޹zٔXn܊SNR[}Z";AuvB޹F m|ub:(35 Pf7[Mr2?eQuT}EǬ9ݎֳcg7<س'&%fjxӿWQt(86Y췙~UoCim&Ur'_9'B]r2q0~;͇ "pPvK -LqCGwO敟 R\%# f|1*_0r]r,g<+w(!{܏+=w>XOT=4#Ł'O!|@ j!@{5" ҂ԇtٜ+'H & K._vosҳYk`fڠ2nZ(CM]Xv0BŐPĤB*I(XPp&#WXF͙oY/̊uQ8j(%Fe0J+mT ie z׾.N?ʬ.Nztv{ZA rb{$4G5~%4f~\/x{fHh#5~yu!\gC~CLrKH-!$nN^"I‘@8ElP`Z' # )2$ JT0IB"BSppR:*U%ⷨT)y4:M"% 2mz"VBTDqJ"ΥG\RMWٽM5?g{cݨX3n%VM>4b;t}u8/n:'KX41A>G`t|KGrϧxlfw`۽lh Qc+I7V퉡Abh74s9c+۽YdY41>rC}v#j{ĸHY./#3pX\qroƳy\.gWWfK6ox-WZB81kgozOdN㞽M_-,S6tM7+p~??e{f~kgWۿ UFx"k-&.FF#NwDYkGXJ!*9*[J%q sQξX;g%l+^ 6lذ~%.lT7@خ>rC*8* )^Z9_9m tc-"$TKܟmvaސgy[L٠2jvBq[ɾ6GF ;X[ZGC+(٘ b]aCz 9g(Gr1z9-_X !l"iI%wv()Ft䃋@92懖wrZWhs]s|w#Ai%WL{J@|:OA2_EB$4ɀI2 ÁfaZ& H r>L*%Ucj|NbXZHLv&oa'|@QG8 Ta UJ!!b7;9"dܧݬtV1Puyn6*Fie¬*[k(E9zgeAd0Dx ̗e7@J.e7uݔ]!KӤnKL3~u[.8kx8SQ:ᒺ[T%ejfM V*cLӺO$r$my%;엲)pJδf xsA0wEK1!HID*%KeP lx ;˲ L ykLyklq$1'M)OA5wRb[̻X2t +1!҃A }5*DŎA1Z&Hׅ|[/BSo_vWm,(%--m KiKkag?hp-jhd^ď 8&,roi j rO.=R*xz|>ιjIs@[htȎ y)dUY 7+^6v7lK5m~6A< يn{ÃDyB4ڇޘؽC@Vw2<ޜZ8״uWɁ ߩ\p+HSLږ:# zJې: _M<n)b3%RN9i q=(戵&t%Nk飜<ʰXw! 
I^OO&E{i=ģ]؏{3r>΢)qe1P&JY cmĀ͑ 24QݎL8=Ec7F3~rI?zJ'# '$p`z~}=|$ :=:k^vtk VOR9'~x P _6  u~x38 -:|Ǵ-z ̏c?o;dSXmy4ss Nm-5&SB"&$4QL-gI >ϱre=| M*@M+ʩq¥q&Ntl8OXƩ@J-7q',&vT)@%n]3NFta搎o>ֿP&)WX픭#4I_)Lc\ӅJzYl ky|T׽qfgKw^nQ'oK=W~ϸQƙKae?w~;]q^0ҦIʍYl`0,]Fp$ƠF38ZJ_q&.c5n4EʽE`SEXdGܛnh;.hwv{Wx}vzw^OkRu]K+Ҕ8E `i*XS-\ı!"?ЄbN:_pV\\].Ґ2$QD˔H g% PeBbFlR.X;*f,f9X"(1UxWr ZI$e T hJ0U+h`۔'8`Xs p8F%q$ g1&Vj VM2rؘ:NKeB( M$0=-T 8 ʓr(Nh%IRpr@lXba0gˆ$)bDARF49%`WDG*NaisVcO1Gb z<Ǽ_[aQLv1!U6 =B#"0%3Ija9@(8aQDʔcSʵVp 0FL@෰cnl $2hDlrG)- &!T„-Հߍw8st`e 0|6s̾ݱi-l[(|o'-1}Ksj&!ގ\ѧNM% >Z̑w|OoY ~K ܁G}> ޵jNݝeMb ^dl6XLMHcոa (Z KXD| 4-!lέ6$@K 5Rq .{ް^`[7 Q̀RDmZ%>Fh5 !K-% p_/ 2rFtH 8=ûvP@٪R8L( &gdktE<vӝ ]J^vXZxZ8(&^J(؅r480G@Aǂ!ѹe%xńW*H׬N 1Ѝ|v)?S{x^( ,_$_l?+XR}ڀ} ~cVX]{{ofy3j>zXo}7|wɸ?j4l^=s41=P(P&.G(01`|62J@L!YjXiՉ :)"*,竗fҭSgݩh21ׄ%oK21IPT1Ʊ55Rkf q4QT#>פ 9$JDqB50 +"eYLSlE<lĺQpFzU؁h8^$p8oqj|˾z6ȴYV67K?I-{3? Z3{d'o<7MUQ1~97ZW0\AV ?K ~pIn_AqϟԻ ltWj P|Cm[fcmKUjǜ'n85+ &%+`0f:IQ@n(d5%X\Cʲ0Ή<&ͽDzPe|Ʒ#3Bv[_TB^?8LY6B؞t#70I)@”eL +=O fDʲOc 8M|ˏl+/:V9Veiku%r1oMF% uN!Ѣz32vJ.\׮ %I)uz W#ӶG }RgܻY~g#.6$gۯ*X& bǼŮ3{*o}$fEiAs .P wr*Sb5{QS;=!} ܦuj᩽Agp9'w蠾C)[Jp^Syh(1I&ir~.![t'gc1n NAURF9QjXB~,gY. 
AU>zx3^cdNPjټ=yvӝl:$L3tiJ9D6I% 8u I)Gwr>ۯ5~bGT”^s`PhZF}l/JfS{^'M!o3 n6&=G7χݽRYlؖ0^hbF25 *ou@RBrTj>4.N++ָvPQnm^c}\͔WOqA[n鑅ݽy?VZ_:ͨKͫ \p3F L}!&q%syܔV镳S1\mT51=ƒ.'\ܩu)2!Ђ@o~pdnO~4}X!DUQ:JIhq^ dz&Lv"̝{0#vbv_[*[r$c+8]@!(pRbUI0|[:MQu&Sj-;&4(8T34F.0QhuwY"QM{vAVζ4gHm)| kb=_ҷ?ߍ=ױ?k.L<|;^?ɵPv#%&[ |-8sT`XqCT(i3_!joiQb%@$VYm% 㘉8TV(*W{iWD5΋gƌP/Q9!vPB6ie)& ލ^+*dZ¶VBS(Ė)~yhηZPhN?cK5vh|usSsWb /K7-O^,WrF y=^2ٻZ֌upM?vT}JHAè+Jv!+1Bp]P9!Jdj?@:8{B'G+!rPda?Mk諎W_}J(i%cB+-н5.:-Z8arBs}8k~%/ vS]GY\·YPu톚Z$By"Z&aΑIBJd)aLlG<4+H-7`XMO;w}IWytf͊ [Μ-L?#S{b}m(L[rM `o~^D^vpjKD}9av")gU2Z٫KXB/Ip[%c+Y)SG ⏿CWk2|J (;]Lӯ'X<-?SEL+JIXSf"":/IPrAJAR+B^ $C8+H!9KeCب9-J#FdfuɖXQRyg= b|BW:OȠEw.,Jl ;.&%p _<\ҮgV^QP9kESXTk~#x-vU-pܥrI@偳n.F',5!vS%A'/& n{ϫ7!15E @!{g7E s9t؎Y hսb,pcm/u>䢩jDFI.q *PvȺi\AgqqZw7,MΓ^)?W*r5&H٪.J +3wqBH=;p‡PTri,ƂRd 0h -ibeIb0& YfT #)0fdY휔j>zXo}m\8JTH5A'vL6ީ߄dK9J+S` # )Y,VqXQ 'JcdT[sxc1[?/tI޳=4k|} BcGưjGcQAT^R{@`Lŋ\.^Zp$1J`$4%̱L W4I!)#L(c׹cc;/zCM,"fLǘGqnG$-+p 6 p<&x<<>f>_ ND : Z0Ћ-w#D)b%K106UvPj~7^z#?kwasȬsrpks4>̭y邫Y sn~}xpTtrC"[nu5M?]_(3BMxLg u*^A4 9SՈdE{Pj~9)$?="u@v"Qvapǚխ0 PWP̕i)LmZA-~1ԊKd8@GVRp2YlLԋ[J O(E xT޺vؾBSDJ@p3kL17R@*e6!ňYf :b%B)0A◛ v(%S-Q7u{fKpɝNn]1ifvFgsv.+©oF?Oخ]ayڷlrZ^_di{ݦ7 1FF %'ME^||۟qϙ_e~OOƢ4Af"|[x1y|}D2WO+kFVH(I"i+H('[nHW|Օu(gcbfW ɞyCQ]Em}AjCaD:̪JA)_^l\y{7{ '5^H^3x8GRT$jd8g~sAʬ%Həu>M2%R,%4MP\mQl&]n!r\1"Jd#-ո"Dl5.Z?w;LoՂVp5S Tꮭ,JiV +Cx F'G(+Ix*ޥ<ݴ(pye 'nEoo Tq_~e8ÍeKe?f_PW]:R! 
Jan 30 08:09:20 crc systemd[1]: Starting Kubernetes Kubelet... Jan 30 08:09:20 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 30 08:09:20 crc
restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 
08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 
08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 08:09:20 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:20 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 08:09:21 crc restorecon[4690]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 08:09:21 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Jan 30 08:09:21 crc kubenswrapper[4870]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 08:09:21 crc kubenswrapper[4870]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Jan 30 08:09:21 crc kubenswrapper[4870]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 08:09:21 crc kubenswrapper[4870]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 08:09:21 crc kubenswrapper[4870]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 30 08:09:21 crc kubenswrapper[4870]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.804182    4870 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.814696    4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.815565    4870 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.815606    4870 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.815956    4870 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.815995    4870 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816014    4870 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816027    4870 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816040    4870 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816053    4870 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816065    4870 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816076    4870 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816087    4870 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816101    4870 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816114    4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816124    4870 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816134    4870 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816145    4870 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816155    4870 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816169    4870 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816183    4870 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816194    4870 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816206    4870 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816216    4870 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816226    4870 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816236    4870 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816246    4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816256    4870 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816267    4870 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816277    4870 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816287    4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816297    4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816307    4870 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816328    4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816338    4870 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816348    4870 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816358    4870 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816368    4870 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816378    4870 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816388    4870 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816398    4870 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816408    4870 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816418    4870 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816429    4870 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816439    4870 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816450    4870 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816460    4870 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816469    4870 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816486    4870 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816496    4870 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816506    4870 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816516    4870 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816526    4870 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816536    4870 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816547    4870 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816558    4870 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816569    4870 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816579    4870 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816589    4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816601    4870 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816613    4870 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816623    4870 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816638    4870 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816650    4870 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816662    4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816672    4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816689    4870 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816700    4870 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816713    4870 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816723    4870 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816734    4870 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.816745    4870 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.817976    4870 flags.go:64] FLAG: --address="0.0.0.0"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818012    4870 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818037    4870 flags.go:64] FLAG: --anonymous-auth="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818053    4870 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818069    4870 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818082    4870 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818098    4870 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818123    4870 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818136    4870 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818150    4870 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818165    4870 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818178    4870 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818191    4870 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818205    4870 flags.go:64] FLAG: --cgroup-root=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818217    4870 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818229    4870 flags.go:64] FLAG: --client-ca-file=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818243    4870 flags.go:64] FLAG: --cloud-config=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818256    4870 flags.go:64] FLAG: --cloud-provider=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818268    4870 flags.go:64] FLAG: --cluster-dns="[]"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818286    4870 flags.go:64] FLAG: --cluster-domain=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818298    4870 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818310    4870 flags.go:64] FLAG: --config-dir=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818322    4870 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818336    4870 flags.go:64] FLAG: --container-log-max-files="5"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818352    4870 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818363    4870 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818376    4870 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818388    4870 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818400    4870 flags.go:64] FLAG: --contention-profiling="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818412    4870 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818423    4870 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818436    4870 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818447    4870 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818464    4870 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818476    4870 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818488    4870 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818499    4870 flags.go:64] FLAG: --enable-load-reader="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818511    4870 flags.go:64] FLAG: --enable-server="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818523    4870 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818540    4870 flags.go:64] FLAG: --event-burst="100"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818552    4870 flags.go:64] FLAG: --event-qps="50"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818564    4870 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818575    4870 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818586    4870 flags.go:64] FLAG: --eviction-hard=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818601    4870 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818613    4870 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818625    4870 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818639    4870 flags.go:64] FLAG: --eviction-soft=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818653    4870 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818665    4870 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818677    4870 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818689    4870 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818701    4870 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818713    4870 flags.go:64] FLAG: --fail-swap-on="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818724    4870 flags.go:64] FLAG: --feature-gates=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818740    4870 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818753    4870 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818765    4870 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818778    4870 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818789    4870 flags.go:64] FLAG: --healthz-port="10248"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818802    4870 flags.go:64] FLAG: --help="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818813    4870 flags.go:64] FLAG: --hostname-override=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818825    4870 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818837    4870 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818849    4870 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818860    4870 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818872    4870 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818921    4870 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818933    4870 flags.go:64] FLAG: --image-service-endpoint=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818944    4870 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818956    4870 flags.go:64] FLAG: --kube-api-burst="100"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818968    4870 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818981    4870 flags.go:64] FLAG: --kube-api-qps="50"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.818992    4870 flags.go:64] FLAG: --kube-reserved=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819004    4870 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819016    4870 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819029    4870 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819040    4870 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819051    4870 flags.go:64] FLAG: --lock-file=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819062    4870 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819075    4870 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819089    4870 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819109    4870 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819125    4870 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819137    4870 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819150    4870 flags.go:64] FLAG: --logging-format="text"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819162    4870 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819177    4870 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819189    4870 flags.go:64] FLAG: --manifest-url=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819201    4870 flags.go:64] FLAG: --manifest-url-header=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819218    4870 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819231    4870 flags.go:64] FLAG: --max-open-files="1000000"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819246    4870 flags.go:64] FLAG: --max-pods="110"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819258    4870 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819269    4870 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819281    4870 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819292    4870 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819304    4870 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819317    4870 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819329    4870 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819363    4870 flags.go:64] FLAG: --node-status-max-images="50"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819375    4870 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819387    4870 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819399    4870 flags.go:64] FLAG: --pod-cidr=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819410    4870 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819431    4870 flags.go:64] FLAG: --pod-manifest-path=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819443    4870 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819455    4870 flags.go:64] FLAG: --pods-per-core="0"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819467    4870 flags.go:64] FLAG: --port="10250"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819479    4870 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819492    4870 flags.go:64] FLAG: --provider-id=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819504    4870 flags.go:64] FLAG: --qos-reserved=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819518    4870 flags.go:64] FLAG: --read-only-port="10255"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819531    4870 flags.go:64] FLAG: --register-node="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819543    4870 flags.go:64] FLAG: --register-schedulable="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819555    4870 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819578    4870 flags.go:64] FLAG: --registry-burst="10"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819590    4870 flags.go:64] FLAG: --registry-qps="5"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819602    4870 flags.go:64] FLAG: --reserved-cpus=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819616    4870 flags.go:64] FLAG: --reserved-memory=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819631    4870 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819643    4870 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819656    4870 flags.go:64] FLAG: --rotate-certificates="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819670    4870 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819681    4870 flags.go:64] FLAG: --runonce="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819693    4870 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819706    4870 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819719    4870 flags.go:64] FLAG: --seccomp-default="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819731    4870 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819743    4870 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819756    4870 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819768    4870 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819781    4870 flags.go:64] FLAG: --storage-driver-password="root"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819793    4870 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819805    4870 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819816    4870 flags.go:64] FLAG: --storage-driver-user="root"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819828    4870 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819840    4870 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819854    4870 flags.go:64] FLAG: --system-cgroups=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819866    4870 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819921    4870 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819933    4870 flags.go:64] FLAG: --tls-cert-file=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819944    4870 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819977    4870 flags.go:64] FLAG: --tls-min-version=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.819990    4870 flags.go:64] FLAG: --tls-private-key-file=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820003    4870 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820015    4870 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820027    4870 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820040    4870 flags.go:64] FLAG: --v="2"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820058    4870 flags.go:64] FLAG: --version="false"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820073    4870 flags.go:64] FLAG: --vmodule=""
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820088    4870 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.820100    4870 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820399    4870 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820418    4870 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820452    4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820465    4870 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820479    4870 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820491    4870 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820503    4870 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820514    4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820525    4870 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820534    4870 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820545    4870 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820556    4870 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820565    4870 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820576    4870 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820586    4870 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820596    4870 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820606 4870 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820616 4870 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820626 4870 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820636 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820646 4870 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820656 4870 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820666 4870 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820676 4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820686 4870 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820696 4870 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820707 4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820721 4870 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820734 4870 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820744 4870 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820762 4870 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820774 4870 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820785 4870 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820795 4870 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820805 4870 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820814 4870 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820824 4870 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820836 4870 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820866 4870 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820919 4870 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820933 4870 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820945 4870 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820957 4870 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820967 4870 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820978 4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820988 4870 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.820998 4870 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821008 4870 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821018 4870 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821028 4870 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821038 4870 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821079 4870 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821091 4870 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821106 4870 feature_gate.go:330] unrecognized 
feature gate: ImageStreamImportMode Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821116 4870 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821126 4870 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821136 4870 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821146 4870 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821156 4870 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821166 4870 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821176 4870 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821185 4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821200 4870 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821210 4870 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821220 4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821230 4870 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821240 4870 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821250 4870 feature_gate.go:330] unrecognized feature gate: Example Jan 30 08:09:21 crc 
kubenswrapper[4870]: W0130 08:09:21.821260 4870 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821270 4870 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.821280 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.822383 4870 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.836194 4870 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.836245 4870 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836378 4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836392 4870 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836401 4870 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836414 4870 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836426 4870 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836436 4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836444 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836453 4870 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836462 4870 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836470 4870 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836479 4870 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836488 4870 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836496 4870 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836505 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836514 4870 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836522 4870 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836530 4870 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836537 4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 08:09:21 crc kubenswrapper[4870]: 
W0130 08:09:21.836545 4870 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836553 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836561 4870 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836569 4870 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836576 4870 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836584 4870 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836592 4870 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836600 4870 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836608 4870 feature_gate.go:330] unrecognized feature gate: Example Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836616 4870 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836623 4870 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836631 4870 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836639 4870 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836647 4870 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836654 4870 feature_gate.go:330] unrecognized feature gate: 
OVNObservability Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836664 4870 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836678 4870 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836694 4870 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836713 4870 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836723 4870 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836735 4870 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836746 4870 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836756 4870 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836766 4870 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836780 4870 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836792 4870 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836802 4870 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836811 4870 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836819 4870 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836831 4870 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836841 4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836849 4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836858 4870 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836866 4870 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836917 4870 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836938 4870 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836949 4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836960 4870 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836968 4870 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 
08:09:21.836976 4870 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836985 4870 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.836993 4870 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837003 4870 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837010 4870 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837018 4870 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837028 4870 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837039 4870 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837048 4870 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837056 4870 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837064 4870 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837072 4870 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837080 4870 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837090 4870 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.837104 4870 feature_gate.go:386] 
feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837365 4870 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837380 4870 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837388 4870 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837397 4870 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837405 4870 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837413 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837420 4870 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837428 4870 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837437 4870 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837445 4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837452 4870 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 08:09:21 crc 
kubenswrapper[4870]: W0130 08:09:21.837460 4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837467 4870 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837475 4870 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837483 4870 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837493 4870 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837504 4870 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837512 4870 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837521 4870 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837531 4870 feature_gate.go:330] unrecognized feature gate: Example Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837539 4870 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837549 4870 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837557 4870 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837565 4870 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837574 4870 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837584 4870 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837597 4870 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837617 4870 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837627 4870 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837637 4870 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837647 4870 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837657 4870 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837666 4870 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837675 4870 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837688 4870 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837698 4870 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 
08:09:21.837708 4870 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837717 4870 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837727 4870 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837736 4870 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837750 4870 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837763 4870 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837773 4870 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837783 4870 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837793 4870 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837803 4870 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837813 4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837823 4870 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837833 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837842 4870 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837854 4870 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837865 4870 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837909 4870 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837921 4870 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837931 4870 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837941 4870 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837954 4870 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837966 4870 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837975 4870 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837984 4870 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.837994 4870 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838005 4870 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838015 4870 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838026 4870 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838035 4870 feature_gate.go:330] unrecognized feature gate: 
GCPLabelsTags Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838045 4870 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838055 4870 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838064 4870 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838073 4870 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838082 4870 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 08:09:21 crc kubenswrapper[4870]: W0130 08:09:21.838095 4870 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.838113 4870 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.839824 4870 server.go:940] "Client rotation is on, will bootstrap in background" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.846463 4870 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.846619 4870 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.848397 4870 server.go:997] "Starting client certificate rotation" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.848446 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.850338 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-19 16:12:58.090096136 +0000 UTC Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.850510 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.887282 4870 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.889633 4870 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 08:09:21 crc kubenswrapper[4870]: E0130 08:09:21.891251 4870 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.912787 4870 log.go:25] "Validated CRI v1 runtime API" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.961411 4870 log.go:25] "Validated CRI v1 image API" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.963295 4870 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.969259 4870 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-30-08-04-20-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.969316 4870 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.993383 4870 manager.go:217] Machine: {Timestamp:2026-01-30 08:09:21.98989245 +0000 UTC m=+0.685439599 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7dbac932-0e54-4045-a1f0-fa334c8e1b7e BootID:42bb4058-de5f-47d3-b90e-bda57dd064e9 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 
Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:91:45:e9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:91:45:e9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:41:a8:12 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:20:47:3a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:5e:ae:0a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:97:33:fe Speed:-1 Mtu:1496} {Name:eth10 MacAddress:aa:e9:cb:49:12:c2 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a2:f3:7a:55:fe:63 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.993700 4870 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available. Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.993914 4870 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.995679 4870 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.996010 4870 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.996061 4870 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"
Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.996352 4870 topology_manager.go:138] "Creating topology manager with none policy" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.996369 4870 container_manager_linux.go:303] "Creating device plugin manager" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.997250 4870 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.997297 4870 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.998616 4870 state_mem.go:36] "Initialized new in-memory state store" Jan 30 08:09:21 crc kubenswrapper[4870]: I0130 08:09:21.998735 4870 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.002158 4870 kubelet.go:418] "Attempting to sync node with API server" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.002183 4870 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.002251 4870 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.002281 4870 kubelet.go:324] "Adding apiserver pod source" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.002301 4870 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.006515 4870 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.007834 4870 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.008524 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.008613 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.008716 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.008796 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.010463 4870 kubelet.go:854] "Not starting ClusterTrustBundle 
informer because we are in static kubelet mode" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012473 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012522 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012548 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012590 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012614 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012627 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012645 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012666 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012702 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012721 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012757 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.012771 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.013938 4870 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.014319 4870 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.014648 4870 server.go:1280] "Started kubelet" Jan 30 08:09:22 crc systemd[1]: Started Kubernetes Kubelet. Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.020327 4870 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.020495 4870 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.021109 4870 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.024671 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.024718 4870 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.024748 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 00:52:45.735490502 +0000 UTC Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.025076 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.025268 4870 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.025287 4870 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.025403 4870 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 30 08:09:22 crc 
kubenswrapper[4870]: W0130 08:09:22.026408 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.026470 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.028832 4870 factory.go:55] Registering systemd factory Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.028868 4870 factory.go:221] Registration of the systemd container factory successfully Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.029931 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="200ms" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.032919 4870 server.go:460] "Adding debug handlers to kubelet server" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.032095 4870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.227:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f73dac4603525 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 08:09:22.014598437 +0000 UTC m=+0.710145576,LastTimestamp:2026-01-30 08:09:22.014598437 +0000 UTC m=+0.710145576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.034466 4870 factory.go:153] Registering CRI-O factory Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.034498 4870 factory.go:221] Registration of the crio container factory successfully Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.034578 4870 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.034614 4870 factory.go:103] Registering Raw factory Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.034630 4870 manager.go:1196] Started watching for new ooms in manager Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.035338 4870 manager.go:319] Starting recovery of all containers Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.038617 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.038654 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.038665 4870 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.038674 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.038684 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.038692 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040366 4870 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040387 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 
08:09:22.040398 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040411 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040420 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040430 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040438 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040450 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040461 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040470 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040479 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040488 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040496 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040507 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040515 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040525 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040536 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040548 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040557 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040566 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040574 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040584 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040614 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040624 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040632 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040641 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040651 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040660 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040669 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040677 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040686 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040695 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040704 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040713 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040722 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040731 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040740 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040748 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040758 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040767 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040776 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040786 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040812 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040820 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040829 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040843 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040852 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040898 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040914 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040927 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040941 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040954 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040966 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040977 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.040994 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041007 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041019 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041035 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041049 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041060 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041085 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041098 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041114 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041126 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041138 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041149 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041163 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041178 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041218 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041230 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041243 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041257 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041268 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041280 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041293 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041305 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041317 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041330 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041341 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041357 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041369 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041380 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041392 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041404 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041417 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041432 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041446 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041458 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041469 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041480 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041491 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041502 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041513 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041529 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041537 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041575 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041584 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041593 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041616 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041630 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041639 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041649 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041658 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041673 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041686 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041695 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041704 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041716 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041725 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041734 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041742 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041758 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041766 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041774 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041783 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041791 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041799 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041808 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041816 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041827 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041836 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041844 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041853 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041862 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041892 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041910 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041924 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041961 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041973 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041982 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.041992 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042004 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042013 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042022 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042030 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042040 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042048 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042057 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042066 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042075 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042087 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042098 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042107 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042120 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042129 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042138 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042146 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042159 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042169 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042177 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c"
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042186 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042195 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042207 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042216 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042224 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042233 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042241 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042254 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042262 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042272 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042280 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042289 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042298 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042307 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042316 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042324 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042333 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042342 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042354 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042362 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042371 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042379 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042387 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042396 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" 
seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042409 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042420 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042429 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042437 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042448 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042456 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042467 
4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042477 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042485 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042496 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042504 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042513 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042521 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042535 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042543 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042553 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042561 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042569 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042581 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042590 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042599 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042607 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042616 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042624 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042633 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042641 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042650 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042658 4870 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042667 4870 reconstruct.go:97] "Volume reconstruction finished" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.042674 4870 reconciler.go:26] "Reconciler: start to sync state" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.051366 4870 manager.go:324] Recovery completed Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.062499 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.064333 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.064376 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.064389 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.065492 4870 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.065506 4870 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.065531 4870 state_mem.go:36] "Initialized new in-memory state store" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.068656 4870 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.073086 4870 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.073212 4870 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.073290 4870 kubelet.go:2335] "Starting kubelet main sync loop" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.073386 4870 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.076014 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.076110 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:22 crc 
kubenswrapper[4870]: I0130 08:09:22.080050 4870 policy_none.go:49] "None policy: Start" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.080708 4870 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.080735 4870 state_mem.go:35] "Initializing new in-memory state store" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.125945 4870 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.139982 4870 manager.go:334] "Starting Device Plugin manager" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.140047 4870 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.140065 4870 server.go:79] "Starting device plugin registration server" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.140558 4870 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.140580 4870 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.140728 4870 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.140948 4870 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.140966 4870 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.147405 4870 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.173541 4870 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.173631 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.174761 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.174791 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.174799 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.174905 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175185 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175225 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175762 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175775 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175925 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175935 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.175958 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176736 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176815 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176840 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176755 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176870 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176887 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176894 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176899 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.176902 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.177022 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.177047 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.177093 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.177849 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.177913 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.177926 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.178212 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.178242 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.178251 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.178403 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.178559 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.178606 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179496 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179551 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179602 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179654 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179678 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179689 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179900 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.179926 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.180426 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.180481 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.180495 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.231049 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="400ms" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.240716 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.242249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.242309 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.242328 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.242366 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.243125 4870 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.227:6443: connect: connection refused" node="crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244581 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244639 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244678 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244711 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244851 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244920 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244955 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.244984 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.245046 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.245120 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.245185 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.245225 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.245300 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.245366 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.245400 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.346761 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.346836 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.346899 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.346932 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.346963 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 
08:09:22.346990 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347003 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347069 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347072 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347137 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347018 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347148 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347190 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347088 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347076 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347223 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347246 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347245 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347274 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347310 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347354 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347377 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347397 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347418 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347436 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347455 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347493 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347507 4870 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347524 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.347682 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.443830 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.445125 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.445163 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.445173 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.445200 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.445612 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.227:6443: connect: connection 
refused" node="crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.516156 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.534188 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.544032 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.563711 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.566646 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.567563 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1affebc35dae28dcd241b2f06b70f52f5ecf2267eb04e63eb59eed161581f6dc WatchSource:0}: Error finding container 1affebc35dae28dcd241b2f06b70f52f5ecf2267eb04e63eb59eed161581f6dc: Status 404 returned error can't find the container with id 1affebc35dae28dcd241b2f06b70f52f5ecf2267eb04e63eb59eed161581f6dc Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.573108 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5e388a87f7a55b9df07599741c664a4182aee780698c958d97fb31d51483841c WatchSource:0}: Error finding container 5e388a87f7a55b9df07599741c664a4182aee780698c958d97fb31d51483841c: Status 404 returned error can't find the container with id 
5e388a87f7a55b9df07599741c664a4182aee780698c958d97fb31d51483841c Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.574423 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-05af0ac4aba97e96a52a9eb545c02aec606f709da00cc7629cf49728273954c6 WatchSource:0}: Error finding container 05af0ac4aba97e96a52a9eb545c02aec606f709da00cc7629cf49728273954c6: Status 404 returned error can't find the container with id 05af0ac4aba97e96a52a9eb545c02aec606f709da00cc7629cf49728273954c6 Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.584135 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d5839aa0654842c1099eb16eb3c04755be1b93c37aff624f9d47a89b6923fde3 WatchSource:0}: Error finding container d5839aa0654842c1099eb16eb3c04755be1b93c37aff624f9d47a89b6923fde3: Status 404 returned error can't find the container with id d5839aa0654842c1099eb16eb3c04755be1b93c37aff624f9d47a89b6923fde3 Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.591553 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-705ac72741ca37498e805dcc6db1a1312d0c51768752f267467145db9bb8144c WatchSource:0}: Error finding container 705ac72741ca37498e805dcc6db1a1312d0c51768752f267467145db9bb8144c: Status 404 returned error can't find the container with id 705ac72741ca37498e805dcc6db1a1312d0c51768752f267467145db9bb8144c Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.631791 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" 
interval="800ms" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.846578 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.848160 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.848249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.848271 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:22 crc kubenswrapper[4870]: I0130 08:09:22.848340 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.849335 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.227:6443: connect: connection refused" node="crc" Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.908914 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.909082 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:22 crc kubenswrapper[4870]: W0130 08:09:22.979919 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:22 crc kubenswrapper[4870]: E0130 08:09:22.980062 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.015496 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.025553 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 18:04:54.209445878 +0000 UTC Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.080758 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e388a87f7a55b9df07599741c664a4182aee780698c958d97fb31d51483841c"} Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.083487 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1affebc35dae28dcd241b2f06b70f52f5ecf2267eb04e63eb59eed161581f6dc"} Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.085094 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"705ac72741ca37498e805dcc6db1a1312d0c51768752f267467145db9bb8144c"} Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.086140 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d5839aa0654842c1099eb16eb3c04755be1b93c37aff624f9d47a89b6923fde3"} Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.087216 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05af0ac4aba97e96a52a9eb545c02aec606f709da00cc7629cf49728273954c6"} Jan 30 08:09:23 crc kubenswrapper[4870]: W0130 08:09:23.169107 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:23 crc kubenswrapper[4870]: E0130 08:09:23.169364 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:23 crc kubenswrapper[4870]: E0130 08:09:23.433019 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="1.6s" Jan 30 08:09:23 crc kubenswrapper[4870]: W0130 08:09:23.532122 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:23 crc kubenswrapper[4870]: E0130 08:09:23.532234 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.649949 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.652309 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.652388 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.652409 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:23 crc kubenswrapper[4870]: I0130 08:09:23.652455 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 08:09:23 crc kubenswrapper[4870]: E0130 08:09:23.653306 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.227:6443: connect: connection refused" node="crc" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.015737 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 
08:09:24.025748 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:00:03.122819448 +0000 UTC Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.028992 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 08:09:24 crc kubenswrapper[4870]: E0130 08:09:24.030050 4870 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.091850 4870 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d" exitCode=0 Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.091928 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.092106 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.093376 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.093428 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.093445 4870 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.095201 4870 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="fe9b232957f2eea82ca2086063aa00fe190428df468751e40d205478af3ea9a1" exitCode=0 Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.095289 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.095380 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"fe9b232957f2eea82ca2086063aa00fe190428df468751e40d205478af3ea9a1"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.096568 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.096620 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.096637 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.099129 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.099226 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.099298 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.099387 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.099204 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.100468 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.100510 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.100528 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.101159 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e" exitCode=0 Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.101233 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.101380 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.102538 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.102587 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.102612 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.103200 4870 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03" exitCode=0 Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.103259 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03"} Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.103403 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.105270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.105301 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.105312 4870 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.106844 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.108154 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.108227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:24 crc kubenswrapper[4870]: I0130 08:09:24.108312 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:25 crc kubenswrapper[4870]: E0130 08:09:25.011917 4870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.227:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f73dac4603525 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 08:09:22.014598437 +0000 UTC m=+0.710145576,LastTimestamp:2026-01-30 08:09:22.014598437 +0000 UTC m=+0.710145576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.015688 4870 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.026730 4870 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 06:40:29.991810548 +0000 UTC Jan 30 08:09:25 crc kubenswrapper[4870]: E0130 08:09:25.033710 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="3.2s" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.110295 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.110345 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.110358 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.110371 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.112495 4870 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9" exitCode=0 Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.112547 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.112580 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.113706 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.113733 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.113743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.116218 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.116247 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.116272 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.116317 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.120242 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.120286 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.120306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.128048 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.128417 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0b968ca52ac44d69f056bcc02f5e1b2ad03c9700bf54ec25d893c936a721f595"} Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.128482 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.129689 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.129717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.129727 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.129978 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.130004 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.130020 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:25 crc kubenswrapper[4870]: W0130 08:09:25.232134 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:25 crc kubenswrapper[4870]: E0130 08:09:25.232205 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.253998 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.255325 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.255388 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 08:09:25.255398 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:25 crc kubenswrapper[4870]: I0130 
08:09:25.255428 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 08:09:25 crc kubenswrapper[4870]: E0130 08:09:25.255956 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.227:6443: connect: connection refused" node="crc" Jan 30 08:09:25 crc kubenswrapper[4870]: W0130 08:09:25.690443 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:25 crc kubenswrapper[4870]: E0130 08:09:25.690552 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:25 crc kubenswrapper[4870]: W0130 08:09:25.718437 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:25 crc kubenswrapper[4870]: E0130 08:09:25.718527 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:25 crc kubenswrapper[4870]: W0130 08:09:25.768923 4870 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.227:6443: connect: connection refused Jan 30 08:09:25 crc kubenswrapper[4870]: E0130 08:09:25.769006 4870 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.227:6443: connect: connection refused" logger="UnhandledError" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.027295 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:23:54.532041702 +0000 UTC Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.136816 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596"} Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.136969 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.138221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.138268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.138283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.139524 4870 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd" exitCode=0 Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.139595 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.139619 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd"} Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.139642 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.139696 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.139599 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.140337 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.140354 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.140361 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.140652 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.140667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.140674 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.140912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.141034 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:26 crc kubenswrapper[4870]: I0130 08:09:26.141151 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.028245 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 13:01:11.674842351 +0000 UTC Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.145993 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146466 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572"} Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146500 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973"} Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146510 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf"} Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146520 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e"} Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146571 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146598 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.146985 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.147000 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.147130 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.147161 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.147170 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.713962 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.714160 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.715533 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.715591 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:27 crc kubenswrapper[4870]: I0130 08:09:27.715614 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.029043 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:41:15.413907152 +0000 UTC Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.036503 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.080933 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.155833 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.155827 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2"} Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.155945 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.156672 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.156742 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.156769 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.157341 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.157375 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.157384 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.456636 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.457853 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.457916 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.457929 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.457954 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.552628 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.562604 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 
08:09:28.819352 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.819751 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.819822 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.821237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.821274 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:28 crc kubenswrapper[4870]: I0130 08:09:28.821290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.030079 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 10:37:07.309467741 +0000 UTC
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.158621 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.158762 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.159988 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.160024 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.160036 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.160779 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.160834 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.160855 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.971696 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.971862 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.971967 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.973397 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.973478 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:29 crc kubenswrapper[4870]: I0130 08:09:29.973504 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.030336 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 08:45:18.3674693 +0000 UTC
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.161093 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.162234 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.162288 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.162311 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.416062 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.416425 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.418191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.418236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.418254 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:30 crc kubenswrapper[4870]: I0130 08:09:30.807410 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.031427 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:10:28.634195557 +0000 UTC
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.164126 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.165284 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.165350 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.165367 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.834538 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.834774 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.836552 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.836613 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:31 crc kubenswrapper[4870]: I0130 08:09:31.836634 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:32 crc kubenswrapper[4870]: I0130 08:09:32.032112 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 15:38:02.408900694 +0000 UTC
Jan 30 08:09:32 crc kubenswrapper[4870]: E0130 08:09:32.147492 4870 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 30 08:09:32 crc kubenswrapper[4870]: I0130 08:09:32.170948 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 08:09:32 crc kubenswrapper[4870]: I0130 08:09:32.171122 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:32 crc kubenswrapper[4870]: I0130 08:09:32.172460 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:32 crc kubenswrapper[4870]: I0130 08:09:32.172511 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:32 crc kubenswrapper[4870]: I0130 08:09:32.172530 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:33 crc kubenswrapper[4870]: I0130 08:09:33.033138 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 02:36:10.106358132 +0000 UTC
Jan 30 08:09:34 crc kubenswrapper[4870]: I0130 08:09:34.034184 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 06:07:28.414489924 +0000 UTC
Jan 30 08:09:35 crc kubenswrapper[4870]: I0130 08:09:35.034941 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 06:11:55.209336684 +0000 UTC
Jan 30 08:09:35 crc kubenswrapper[4870]: I0130 08:09:35.171313 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 30 08:09:35 crc kubenswrapper[4870]: I0130 08:09:35.172032 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 30 08:09:35 crc kubenswrapper[4870]: I0130 08:09:35.883424 4870 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 30 08:09:35 crc kubenswrapper[4870]: I0130 08:09:35.883734 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 30 08:09:35 crc kubenswrapper[4870]: I0130 08:09:35.891546 4870 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 30 08:09:35 crc kubenswrapper[4870]: I0130 08:09:35.891830 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 30 08:09:36 crc kubenswrapper[4870]: I0130 08:09:36.035225 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 14:49:43.838486054 +0000 UTC
Jan 30 08:09:37 crc kubenswrapper[4870]: I0130 08:09:37.035374 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 19:29:32.926271856 +0000 UTC
Jan 30 08:09:38 crc kubenswrapper[4870]: I0130 08:09:38.036169 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 23:10:16.746444552 +0000 UTC
Jan 30 08:09:38 crc kubenswrapper[4870]: I0130 08:09:38.043318 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 08:09:38 crc kubenswrapper[4870]: I0130 08:09:38.043678 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:38 crc kubenswrapper[4870]: I0130 08:09:38.045271 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:38 crc kubenswrapper[4870]: I0130 08:09:38.045444 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:38 crc kubenswrapper[4870]: I0130 08:09:38.045597 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:39 crc kubenswrapper[4870]: I0130 08:09:39.037303 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 19:51:17.725661252 +0000 UTC
Jan 30 08:09:39 crc kubenswrapper[4870]: I0130 08:09:39.979562 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:09:39 crc kubenswrapper[4870]: I0130 08:09:39.979825 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:39 crc kubenswrapper[4870]: I0130 08:09:39.981692 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:39 crc kubenswrapper[4870]: I0130 08:09:39.981970 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:39 crc kubenswrapper[4870]: I0130 08:09:39.982176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:39 crc kubenswrapper[4870]: I0130 08:09:39.987068 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.037862 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 08:03:01.27298832 +0000 UTC
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.188220 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.189497 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.189568 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.189594 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.843559 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.844041 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.845560 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.845629 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.845666 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:40 crc kubenswrapper[4870]: E0130 08:09:40.845726 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.849947 4870 trace.go:236] Trace[681849120]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 08:09:30.791) (total time: 10058ms):
Jan 30 08:09:40 crc kubenswrapper[4870]: Trace[681849120]: ---"Objects listed" error: 10058ms (08:09:40.849)
Jan 30 08:09:40 crc kubenswrapper[4870]: Trace[681849120]: [10.058516822s] [10.058516822s] END
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.850102 4870 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.850970 4870 trace.go:236] Trace[1249161597]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 08:09:29.080) (total time: 11770ms):
Jan 30 08:09:40 crc kubenswrapper[4870]: Trace[1249161597]: ---"Objects listed" error: 11770ms (08:09:40.850)
Jan 30 08:09:40 crc kubenswrapper[4870]: Trace[1249161597]: [11.770815313s] [11.770815313s] END
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.851018 4870 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.851856 4870 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.853315 4870 trace.go:236] Trace[1891258546]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 08:09:29.281) (total time: 11571ms):
Jan 30 08:09:40 crc kubenswrapper[4870]: Trace[1891258546]: ---"Objects listed" error: 11571ms (08:09:40.853)
Jan 30 08:09:40 crc kubenswrapper[4870]: Trace[1891258546]: [11.571399396s] [11.571399396s] END
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.853347 4870 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.855315 4870 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 30 08:09:40 crc kubenswrapper[4870]: E0130 08:09:40.855627 4870 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.862041 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.879597 4870 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.904785 4870 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52274->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.904848 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52274->192.168.126.11:17697: read: connection reset by peer"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.904948 4870 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56072->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.904964 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56072->192.168.126.11:17697: read: connection reset by peer"
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.905154 4870 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 30 08:09:40 crc kubenswrapper[4870]: I0130 08:09:40.905168 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.026800 4870 apiserver.go:52] "Watching apiserver"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.029047 4870 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.029238 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"]
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.029538 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.029580 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.029602 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.029538 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.029783 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.029806 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.029797 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.030111 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.030143 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032065 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032137 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032063 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032393 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032532 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032617 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032767 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.032969 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.033240 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.038188 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 12:18:15.68667703 +0000 UTC
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.065306 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.074662 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.082710 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.093723 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.103406 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.112220 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.126864 4870 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.139964 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157051 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157285 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157366 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157432 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157479 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157553 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157635 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157704 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157769 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157832 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157648 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157821 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157886 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.157937 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158002 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158126 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158202 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158277 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158343 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158422 4870 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158521 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158614 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158133 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158409 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158432 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158467 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158904 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159014 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159115 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 
08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159207 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159299 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159014 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159033 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159046 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159140 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159283 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159412 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.158778 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159373 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159349 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159513 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159549 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159554 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159618 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159657 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159690 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159720 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159754 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159767 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159785 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159819 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159855 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159915 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159949 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159981 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.159985 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160006 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160019 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160061 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160083 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160096 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160114 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160131 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160196 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160209 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160236 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160274 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160291 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160299 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160349 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160366 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160376 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160435 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160461 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160485 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160498 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160508 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160533 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160541 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160555 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160577 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160600 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160624 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160645 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160668 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160692 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160715 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160737 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160762 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160784 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160807 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160831 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160855 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160905 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160931 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160957 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160553 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160982 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161005 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161031 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161055 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161081 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161105 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161129 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161153 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161175 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161224 4870 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161272 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161296 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161319 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161340 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161363 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161387 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161413 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161436 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161460 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161483 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161531 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161555 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161581 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161608 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161629 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161653 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 
08:09:41.161677 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161700 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161722 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161744 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161793 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161895 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161945 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161970 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161992 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162014 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162039 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 
08:09:41.162062 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162086 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162111 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162134 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162156 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162177 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162200 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162222 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162244 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162266 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162287 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc 
kubenswrapper[4870]: I0130 08:09:41.162309 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162331 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162353 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162375 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162399 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162423 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162446 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162469 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162492 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162514 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162537 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162559 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162582 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162605 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162629 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162651 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162673 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162697 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162718 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162741 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162765 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162788 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 08:09:41 crc 
kubenswrapper[4870]: I0130 08:09:41.162810 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162836 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162860 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162902 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162926 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162949 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162973 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162996 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163020 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163042 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163096 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 
08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163122 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163146 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163170 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163194 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163217 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163239 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163261 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163286 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163309 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163332 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163356 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163382 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163407 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163431 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163457 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163481 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163506 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163530 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163554 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163578 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163608 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163640 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 08:09:41 crc 
kubenswrapper[4870]: I0130 08:09:41.163663 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163687 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163712 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163737 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163763 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163788 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163812 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163835 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163860 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.164083 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.164113 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 08:09:41 crc 
kubenswrapper[4870]: I0130 08:09:41.160579 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.164137 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160654 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160734 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160751 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160786 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160903 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160918 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160972 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.160983 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161040 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161104 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161236 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161283 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161348 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161406 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161447 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161466 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161533 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161563 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161574 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161639 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161821 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161810 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161822 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161820 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.161996 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162403 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.162967 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163150 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163338 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.163472 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.164012 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.164110 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.165173 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.165256 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.165344 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.165132 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.165760 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.166049 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.166136 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:09:41.666116782 +0000 UTC m=+20.361663971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.166491 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.166602 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.166715 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.166840 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.164162 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.166942 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167121 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167214 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167263 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167299 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167335 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167370 4870 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167418 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167458 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167495 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167529 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167564 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" 
(UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167638 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167682 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167750 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167800 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167839 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167900 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167948 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167985 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168020 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168053 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168089 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168124 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168160 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168201 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168237 4870 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168279 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168374 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168430 4870 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168452 4870 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168473 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168492 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168511 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168533 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168552 4870 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168571 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168590 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168608 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168626 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 
08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168644 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167152 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168662 4870 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167285 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167356 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167437 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167479 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167871 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167900 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.167908 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168027 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168165 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168189 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168742 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168355 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168517 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168485 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168647 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168904 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169045 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169076 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169186 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169347 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169496 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169686 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169993 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.170240 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.170526 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.170605 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.170674 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.171139 4870 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169871 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.172165 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.172293 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.172316 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.172556 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.172774 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.172942 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.169793 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.168683 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.172997 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173009 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173023 4870 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173034 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173054 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173065 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173075 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173085 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173095 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173105 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173115 4870 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173110 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173127 4870 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173166 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173192 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173210 4870 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173229 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173248 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173686 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173722 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173744 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173761 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173780 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173798 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173815 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173833 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173853 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173869 4870 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173908 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173926 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173943 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173964 4870 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173984 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174003 4870 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174020 4870 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174037 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174054 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174071 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174088 4870 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174108 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174125 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174141 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174162 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174178 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174195 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174211 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174227 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174243 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174261 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174282 4870 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174301 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174319 4870 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174337 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174356 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173506 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173620 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173628 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174394 4870 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174421 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174579 4870 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174607 4870 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174625 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174643 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174660 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.173773 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174675 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174282 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174332 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174748 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.174831 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.181705 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.182071 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.182555 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.182902 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.183955 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.184784 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.185785 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186392 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186555 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186566 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186556 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186669 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186850 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186895 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.186966 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187371 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187533 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187541 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187705 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187125 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187553 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187894 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187913 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187939 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.187952 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.188227 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.188369 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.188676 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.188765 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:41.688726763 +0000 UTC m=+20.384273942 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.189130 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.189178 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:41.689167967 +0000 UTC m=+20.384715166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.190696 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.194648 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.195771 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.196191 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.196281 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.196503 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.196752 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.196804 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.196914 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.197132 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.197193 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.197341 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.197489 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.197607 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.197636 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.197796 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.198192 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.198195 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.198286 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.198324 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.198608 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.198866 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.199824 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.199847 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.199860 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.199937 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:41.69991977 +0000 UTC m=+20.395466959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.200223 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.200706 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.200732 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.200745 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.200824 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:41.700789078 +0000 UTC m=+20.396336237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.201850 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.202154 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.202172 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.202555 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596" exitCode=255 Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.202628 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596"} Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.204224 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.204462 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.204482 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.204533 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.204792 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.205097 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.205602 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.205640 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.206411 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.206796 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.207116 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.207527 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.207837 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.208000 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.208429 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.208445 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.208616 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.208669 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.208849 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.210103 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.209225 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.209317 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.209741 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.211049 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.211698 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.215494 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.216971 4870 scope.go:117] "RemoveContainer" containerID="6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.218230 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.223023 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.225188 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.227563 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.231704 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.234113 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.237248 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.241900 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.249729 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.262350 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.274896 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275172 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275225 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275300 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275314 4870 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275327 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275339 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275345 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275351 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275386 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275417 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275428 4870 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275437 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275446 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275455 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275464 4870 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") 
on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275490 4870 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275499 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275508 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275516 4870 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275524 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275532 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275540 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275566 4870 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275576 4870 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275584 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275592 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275600 4870 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275609 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275617 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275626 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275652 4870 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275660 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275669 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275677 4870 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275684 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275692 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275700 4870 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275724 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275733 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275742 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275751 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275759 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275767 4870 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275774 4870 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275782 4870 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275810 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275818 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275826 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275834 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275842 4870 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275850 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275858 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275894 4870 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275903 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275926 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275933 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275941 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275949 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275981 4870 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" 
DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275989 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275997 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276007 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276014 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276023 4870 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276031 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276064 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 
08:09:41.276073 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276086 4870 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276096 4870 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276107 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276140 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276154 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276165 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276233 4870 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276243 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276252 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276261 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276270 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276278 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276286 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276312 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276321 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276329 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276337 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276345 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276353 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276361 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276413 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276426 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276435 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276444 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276452 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276461 4870 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276469 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276496 4870 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc 
kubenswrapper[4870]: I0130 08:09:41.276505 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276513 4870 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276521 4870 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276529 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276538 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276545 4870 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276553 4870 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276581 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276589 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276598 4870 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276606 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276613 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276623 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276632 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276702 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" 
DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276712 4870 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276723 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276732 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276819 4870 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276827 4870 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276840 4870 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276847 4870 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276854 4870 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276862 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276896 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276907 4870 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.276915 4870 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.275385 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.345345 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.356106 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.360915 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 08:09:41 crc kubenswrapper[4870]: W0130 08:09:41.382938 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-3b44f3a0c6ed8d542aae4cf4d599e6f89992d9651f4dda83cc8f819abc975355 WatchSource:0}: Error finding container 3b44f3a0c6ed8d542aae4cf4d599e6f89992d9651f4dda83cc8f819abc975355: Status 404 returned error can't find the container with id 3b44f3a0c6ed8d542aae4cf4d599e6f89992d9651f4dda83cc8f819abc975355 Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.682320 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.682488 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:09:42.682456581 +0000 UTC m=+21.378003690 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.783706 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.783741 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.783756 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:41 crc kubenswrapper[4870]: I0130 08:09:41.783784 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.783907 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.783976 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.783996 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:42.783975477 +0000 UTC m=+21.479522646 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.783922 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784045 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784056 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784058 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:42.784029559 +0000 UTC m=+21.479576738 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784057 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784099 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784104 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:42.784090511 +0000 UTC m=+21.479637620 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784112 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:41 crc kubenswrapper[4870]: E0130 08:09:41.784161 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:42.784146363 +0000 UTC m=+21.479693472 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.038718 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 07:14:12.302701093 +0000 UTC Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.077863 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.078522 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.079756 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.080470 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.081634 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.082161 4870 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.082724 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.083604 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.084364 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.085339 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.085801 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.086834 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.087335 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.087836 4870 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.088730 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.089570 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.090186 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.090553 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.091153 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.091697 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.092259 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.094076 4870 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.094539 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.095633 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.096198 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.096872 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.098025 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.098115 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.098668 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.099783 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.100531 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.101625 4870 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.101829 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.103844 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.104578 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.105659 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.107585 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.108380 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.109357 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.110091 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.111203 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.111510 4870 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.111735 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.112930 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.113549 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.114622 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.115198 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.116178 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.116782 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.117886 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.118432 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.119286 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.119785 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.120750 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.121350 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.121856 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.123832 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.137761 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.152952 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.165231 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.174040 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.175050 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.177378 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.191295 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 30 08:09:42 crc kubenswrapper[4870]: 
I0130 08:09:42.198687 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.206616 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e"} Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.206667 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715"} Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.206682 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3b44f3a0c6ed8d542aae4cf4d599e6f89992d9651f4dda83cc8f819abc975355"} Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.207575 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7"} Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.207603 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3c30a7b979634db9749c244abf7dcb6c275e2ad3259168538d5b6c5a75831f69"} Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.208558 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d21ebb539502ece16466d8085148cb62d450eaedc66f40648913f75628df180e"} Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.209305 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.210354 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.212264 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c"} Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.216367 4870 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.217628 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.226051 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.239757 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.248777 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.256708 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.264750 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.277061 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.289650 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.310709 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.325818 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.338316 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.350601 4870 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.369889 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.385548 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.399053 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.412679 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.424914 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.694715 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.694928 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:09:44.694903783 +0000 UTC m=+23.390450902 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.796142 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.796245 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.796287 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:42 crc kubenswrapper[4870]: I0130 08:09:42.796317 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796420 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796476 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796478 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796502 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796526 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796509 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:44.796489611 +0000 UTC m=+23.492036730 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796590 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:44.796569254 +0000 UTC m=+23.492116413 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796623 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:44.796612525 +0000 UTC m=+23.492159714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796676 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796705 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796727 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:42 crc kubenswrapper[4870]: E0130 08:09:42.796800 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:44.79677678 +0000 UTC m=+23.492323969 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:43 crc kubenswrapper[4870]: I0130 08:09:43.039394 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:41:48.435506103 +0000 UTC Jan 30 08:09:43 crc kubenswrapper[4870]: I0130 08:09:43.074086 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:43 crc kubenswrapper[4870]: I0130 08:09:43.074152 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:43 crc kubenswrapper[4870]: I0130 08:09:43.074196 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:43 crc kubenswrapper[4870]: E0130 08:09:43.074597 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:43 crc kubenswrapper[4870]: E0130 08:09:43.074424 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:43 crc kubenswrapper[4870]: E0130 08:09:43.074669 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:43 crc kubenswrapper[4870]: I0130 08:09:43.214618 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.040808 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 22:24:41.352140521 +0000 UTC Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.219380 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae"} Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.239328 4870 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.254958 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.274363 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.297634 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.326025 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.345769 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.362980 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.386850 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.403127 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.711265 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.711539 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:09:48.711501744 +0000 UTC m=+27.407048903 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.812839 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.812944 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.812995 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:44 crc kubenswrapper[4870]: I0130 08:09:44.813057 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813165 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813250 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:48.813229197 +0000 UTC m=+27.508776306 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813271 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813343 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813344 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813373 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813290 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813442 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-30 08:09:48.813429273 +0000 UTC m=+27.508976382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813483 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:48.813449894 +0000 UTC m=+27.508997083 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813499 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813536 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:44 crc kubenswrapper[4870]: E0130 08:09:44.813642 4870 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:48.813606698 +0000 UTC m=+27.509153847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:45 crc kubenswrapper[4870]: I0130 08:09:45.041294 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 07:06:09.912955788 +0000 UTC Jan 30 08:09:45 crc kubenswrapper[4870]: I0130 08:09:45.074571 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:45 crc kubenswrapper[4870]: I0130 08:09:45.074630 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:45 crc kubenswrapper[4870]: E0130 08:09:45.074680 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:45 crc kubenswrapper[4870]: I0130 08:09:45.074649 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:45 crc kubenswrapper[4870]: E0130 08:09:45.074809 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:45 crc kubenswrapper[4870]: E0130 08:09:45.074995 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:46 crc kubenswrapper[4870]: I0130 08:09:46.041429 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:06:46.117266408 +0000 UTC Jan 30 08:09:46 crc kubenswrapper[4870]: I0130 08:09:46.949926 4870 csr.go:261] certificate signing request csr-bt6jw is approved, waiting to be issued Jan 30 08:09:46 crc kubenswrapper[4870]: I0130 08:09:46.960329 4870 csr.go:257] certificate signing request csr-bt6jw is issued Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.042204 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 12:52:38.454768013 +0000 UTC Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.073779 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.073819 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.073921 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.073925 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.074030 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.074140 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.256482 4870 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.259183 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.259243 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.259257 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.259344 4870 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.271943 4870 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.272076 4870 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.273605 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.273659 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.273674 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.273693 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.273707 4870 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.340360 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8kvt7"] Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.340846 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.345626 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.345832 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.347034 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.356967 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.363248 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.363295 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.363307 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.363327 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.363339 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.392371 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.412541 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.417813 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.417858 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.417867 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.418125 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.418142 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.433155 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.438696 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.438934 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzdvc\" (UniqueName: \"kubernetes.io/projected/1239efc2-d4e8-4a88-a0bf-00a685812999-kube-api-access-bzdvc\") pod \"node-resolver-8kvt7\" (UID: \"1239efc2-d4e8-4a88-a0bf-00a685812999\") " pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.439021 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1239efc2-d4e8-4a88-a0bf-00a685812999-hosts-file\") pod \"node-resolver-8kvt7\" (UID: \"1239efc2-d4e8-4a88-a0bf-00a685812999\") " pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.445841 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.445904 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.445917 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.445935 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.445949 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.448402 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.455980 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.459052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.459089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.459096 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.459110 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.459120 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.461992 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.471692 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: E0130 08:09:47.471853 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.473816 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.473856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.473867 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.473905 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.473920 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.476513 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.495491 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.507262 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.520563 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.536040 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.540367 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzdvc\" (UniqueName: \"kubernetes.io/projected/1239efc2-d4e8-4a88-a0bf-00a685812999-kube-api-access-bzdvc\") pod \"node-resolver-8kvt7\" (UID: \"1239efc2-d4e8-4a88-a0bf-00a685812999\") " pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.540411 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1239efc2-d4e8-4a88-a0bf-00a685812999-hosts-file\") pod \"node-resolver-8kvt7\" (UID: \"1239efc2-d4e8-4a88-a0bf-00a685812999\") " pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.540515 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1239efc2-d4e8-4a88-a0bf-00a685812999-hosts-file\") pod \"node-resolver-8kvt7\" (UID: 
\"1239efc2-d4e8-4a88-a0bf-00a685812999\") " pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.554764 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.557902 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzdvc\" (UniqueName: \"kubernetes.io/projected/1239efc2-d4e8-4a88-a0bf-00a685812999-kube-api-access-bzdvc\") pod \"node-resolver-8kvt7\" (UID: \"1239efc2-d4e8-4a88-a0bf-00a685812999\") " pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.576259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.576315 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.576327 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.576351 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.576362 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.654640 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8kvt7" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.679802 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.680062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.680149 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.680227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.680307 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.775184 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-j4sd8"] Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.775464 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rrkfz"] Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.775929 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hsmrb"] Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.776171 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.776512 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.776713 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.781663 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.782015 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.782032 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.782221 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.782363 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.782700 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.782918 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.786402 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.786422 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.786510 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.786581 4870 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.786605 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.793946 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.793983 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.793994 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.794015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.794027 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.801766 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.813803 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.828411 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.849659 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.881962 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.895995 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.896024 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.896032 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc 
kubenswrapper[4870]: I0130 08:09:47.896044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.896052 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.897619 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.907855 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.917909 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.926125 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.938029 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.945237 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-etc-kubernetes\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.945385 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-hostroot\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.945504 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cnibin\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.945618 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-proxy-tls\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.945731 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-os-release\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.945861 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-netns\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946003 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-cni-multus\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946108 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-system-cni-dir\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 
crc kubenswrapper[4870]: I0130 08:09:47.946208 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-cnibin\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946317 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp4df\" (UniqueName: \"kubernetes.io/projected/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-kube-api-access-hp4df\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946354 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwsvv\" (UniqueName: \"kubernetes.io/projected/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-kube-api-access-jwsvv\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946375 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e8e9e25-2b9b-4820-8282-48e1d930a721-cni-binary-copy\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946395 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-daemon-config\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " 
pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946425 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946445 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-system-cni-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946463 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-cni-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946484 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cni-binary-copy\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946505 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rrkfz\" 
(UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946526 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-socket-dir-parent\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946545 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-multus-certs\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946565 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-os-release\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946584 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-cni-bin\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946606 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-kubelet\") pod \"multus-hsmrb\" (UID: 
\"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946628 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-rootfs\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946664 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-k8s-cni-cncf-io\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946688 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-mcd-auth-proxy-config\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946709 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-conf-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.946731 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7mcd\" (UniqueName: 
\"kubernetes.io/projected/3e8e9e25-2b9b-4820-8282-48e1d930a721-kube-api-access-k7mcd\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.950562 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.961547 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-30 08:04:46 +0000 UTC, rotation deadline is 2026-11-30 12:25:13.515157138 +0000 UTC Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.961749 4870 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7300h15m25.553411431s for next certificate rotation Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.971777 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.993762 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:47Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.998056 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.998109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.998125 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.998148 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:47 crc kubenswrapper[4870]: I0130 08:09:47.998164 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:47Z","lastTransitionTime":"2026-01-30T08:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.015302 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.031452 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.042935 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 13:02:21.699214919 +0000 UTC Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047308 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp4df\" (UniqueName: \"kubernetes.io/projected/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-kube-api-access-hp4df\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047338 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwsvv\" (UniqueName: \"kubernetes.io/projected/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-kube-api-access-jwsvv\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047356 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e8e9e25-2b9b-4820-8282-48e1d930a721-cni-binary-copy\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047374 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-daemon-config\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047390 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047405 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-system-cni-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047419 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-cni-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047436 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cni-binary-copy\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047453 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047469 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-socket-dir-parent\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047485 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-multus-certs\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047522 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-os-release\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047537 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-cni-bin\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047551 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-kubelet\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047570 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-rootfs\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047594 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-k8s-cni-cncf-io\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047610 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-mcd-auth-proxy-config\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047624 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-conf-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047640 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7mcd\" (UniqueName: \"kubernetes.io/projected/3e8e9e25-2b9b-4820-8282-48e1d930a721-kube-api-access-k7mcd\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047655 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-etc-kubernetes\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047675 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-hostroot\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047688 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cnibin\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047721 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-proxy-tls\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047737 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-os-release\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047761 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-netns\") pod 
\"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047774 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-cni-multus\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047772 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-os-release\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047791 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-system-cni-dir\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047844 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-cni-bin\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047859 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-cnibin\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: 
I0130 08:09:48.047892 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-kubelet\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047938 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-conf-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048107 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e8e9e25-2b9b-4820-8282-48e1d930a721-cni-binary-copy\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047297 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.047825 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-system-cni-dir\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048340 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-netns\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048363 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-os-release\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048570 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-hostroot\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048586 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-etc-kubernetes\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048601 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-rootfs\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048616 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cnibin\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " 
pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048635 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-cnibin\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048652 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-var-lib-cni-multus\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048666 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-k8s-cni-cncf-io\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048701 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-cni-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048805 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-mcd-auth-proxy-config\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.048958 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-system-cni-dir\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.049009 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-socket-dir-parent\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.049100 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3e8e9e25-2b9b-4820-8282-48e1d930a721-multus-daemon-config\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.049103 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3e8e9e25-2b9b-4820-8282-48e1d930a721-host-run-multus-certs\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.049175 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.049204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.049638 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-cni-binary-copy\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: \"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.052498 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-proxy-tls\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.061336 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.063053 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwsvv\" (UniqueName: \"kubernetes.io/projected/8bdd7f5e-1187-4760-b2dc-98c3d3286f05-kube-api-access-jwsvv\") pod \"multus-additional-cni-plugins-rrkfz\" (UID: 
\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\") " pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.069422 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7mcd\" (UniqueName: \"kubernetes.io/projected/3e8e9e25-2b9b-4820-8282-48e1d930a721-kube-api-access-k7mcd\") pod \"multus-hsmrb\" (UID: \"3e8e9e25-2b9b-4820-8282-48e1d930a721\") " pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.073057 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp4df\" (UniqueName: \"kubernetes.io/projected/5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d-kube-api-access-hp4df\") pod \"machine-config-daemon-j4sd8\" (UID: \"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\") " pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.073207 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.086971 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.095038 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hsmrb" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.100630 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.100665 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.100676 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.100692 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.100704 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: W0130 08:09:48.105834 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e8e9e25_2b9b_4820_8282_48e1d930a721.slice/crio-c933778356f15b59a6baac7e8f424762da44bd2e5b2a9ec9b779401b6738da10 WatchSource:0}: Error finding container c933778356f15b59a6baac7e8f424762da44bd2e5b2a9ec9b779401b6738da10: Status 404 returned error can't find the container with id c933778356f15b59a6baac7e8f424762da44bd2e5b2a9ec9b779401b6738da10 Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.109149 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.1
1\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.120770 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.141652 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.152959 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cj5db"] Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.153854 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.156237 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.156681 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.156845 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.156993 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.156992 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.157097 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.157181 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.157837 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.169610 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.172214 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.180137 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.182953 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: W0130 08:09:48.183727 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3c8db6_cf22_4fb2_ae7c_a3d544473a6d.slice/crio-fa286fd8c7032ffeaac6381a2f5eba18216372d858ee8acb5cf5ec3f949c8689 WatchSource:0}: Error finding container fa286fd8c7032ffeaac6381a2f5eba18216372d858ee8acb5cf5ec3f949c8689: Status 404 returned error can't find the container with id fa286fd8c7032ffeaac6381a2f5eba18216372d858ee8acb5cf5ec3f949c8689 Jan 30 08:09:48 crc kubenswrapper[4870]: W0130 08:09:48.191240 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bdd7f5e_1187_4760_b2dc_98c3d3286f05.slice/crio-d67af729ffbb403137a43034324451b234a181eb60b2891191e6ddfad4b7cdb9 WatchSource:0}: Error finding container d67af729ffbb403137a43034324451b234a181eb60b2891191e6ddfad4b7cdb9: Status 404 returned error can't find the container with id d67af729ffbb403137a43034324451b234a181eb60b2891191e6ddfad4b7cdb9 Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.196377 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.202787 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.202808 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.202815 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.202828 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.202837 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.219173 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.233835 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerStarted","Data":"d67af729ffbb403137a43034324451b234a181eb60b2891191e6ddfad4b7cdb9"}
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.234750 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"fa286fd8c7032ffeaac6381a2f5eba18216372d858ee8acb5cf5ec3f949c8689"}
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.236599 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerStarted","Data":"f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a"}
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.236748 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerStarted","Data":"c933778356f15b59a6baac7e8f424762da44bd2e5b2a9ec9b779401b6738da10"}
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.236730 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.237832 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8kvt7" event={"ID":"1239efc2-d4e8-4a88-a0bf-00a685812999","Type":"ContainerStarted","Data":"82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0"}
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.237866 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8kvt7" event={"ID":"1239efc2-d4e8-4a88-a0bf-00a685812999","Type":"ContainerStarted","Data":"06e590ba7580dcd49d52849d81c5649ecddac56b8bf02da40b3d5d3017c53a04"}
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.250844 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-ovn-kubernetes\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251046 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-netd\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251205 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-var-lib-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251344 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-ovn\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251461 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-node-log\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251572 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-slash\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251656 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-script-lib\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251727 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-log-socket\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251801 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-bin\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.251898 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-config\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252101 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-env-overrides\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252181 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5ps\" (UniqueName: \"kubernetes.io/projected/36037609-52f9-4c09-8beb-6d35a039347b-kube-api-access-pk5ps\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252279 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-systemd-units\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252352 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-kubelet\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252430 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36037609-52f9-4c09-8beb-6d35a039347b-ovn-node-metrics-cert\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252521 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-etc-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252598 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252690 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252794 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-netns\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.252904 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-systemd\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.253194 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.270608 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.287188 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.299041 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.305076 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.305120 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.305132 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.305149 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.305161 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.311054 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.323420 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.349511 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354226 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-node-log\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354278 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-slash\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354302 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-script-lib\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354325 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-log-socket\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354346 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-bin\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354365 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-config\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354404 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-systemd-units\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354427 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-env-overrides\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354450 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5ps\" (UniqueName: 
\"kubernetes.io/projected/36037609-52f9-4c09-8beb-6d35a039347b-kube-api-access-pk5ps\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354475 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-kubelet\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354498 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-etc-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354511 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-log-socket\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354521 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354541 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/36037609-52f9-4c09-8beb-6d35a039347b-ovn-node-metrics-cert\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354561 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-node-log\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354587 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-netns\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354595 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-slash\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354611 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-systemd\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354632 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354693 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-ovn-kubernetes\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354728 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-netd\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354751 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-var-lib-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354781 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-ovn\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.354835 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-bin\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355366 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-script-lib\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355397 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-systemd-units\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355366 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-config\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355767 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-env-overrides\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355901 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-systemd\") pod \"ovnkube-node-cj5db\" (UID: 
\"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355949 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-kubelet\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355967 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-netd\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.355983 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-etc-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.356017 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.356044 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-var-lib-openvswitch\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 
crc kubenswrapper[4870]: I0130 08:09:48.356055 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.356373 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-ovn\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.356984 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-netns\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.357027 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-ovn-kubernetes\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.361330 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36037609-52f9-4c09-8beb-6d35a039347b-ovn-node-metrics-cert\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.368401 4870 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.377852 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5ps\" (UniqueName: \"kubernetes.io/projected/36037609-52f9-4c09-8beb-6d35a039347b-kube-api-access-pk5ps\") pod \"ovnkube-node-cj5db\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.379475 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.389646 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.400868 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.408242 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.408290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.408300 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.408313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.408321 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.415094 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.430300 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.443475 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.456094 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.464156 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.468943 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: W0130 08:09:48.475430 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36037609_52f9_4c09_8beb_6d35a039347b.slice/crio-cb12165531731a176212e5ceb871fbc54aec2538e3ad27d93d5c0438cf177aa7 WatchSource:0}: Error finding container cb12165531731a176212e5ceb871fbc54aec2538e3ad27d93d5c0438cf177aa7: Status 404 returned error can't find the container with id cb12165531731a176212e5ceb871fbc54aec2538e3ad27d93d5c0438cf177aa7 Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.483181 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.496305 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.507453 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.510106 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.510146 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.510156 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.510172 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.510182 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.518546 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.538437 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.550210 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.567312 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.580113 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.595553 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.606689 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:48Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.612449 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.612472 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.612481 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.612495 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.612504 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.714953 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.714997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.715010 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.715025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.715036 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.758730 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.758837 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 08:09:56.758814464 +0000 UTC m=+35.454361573 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.817847 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.817962 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.817982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.818005 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.818017 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.859738 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.859807 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.859831 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.859852 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.859946 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.859978 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860003 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860016 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860085 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860019 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:56.860002639 +0000 UTC m=+35.555549748 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.859984 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860121 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:56.860101372 +0000 UTC m=+35.555648551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860136 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860140 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:56.860130653 +0000 UTC m=+35.555677762 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860156 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:48 crc kubenswrapper[4870]: E0130 08:09:48.860221 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:09:56.860204856 +0000 UTC m=+35.555752065 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.920649 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.920706 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.920718 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.920738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:48 crc kubenswrapper[4870]: I0130 08:09:48.920751 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:48Z","lastTransitionTime":"2026-01-30T08:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.027684 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.027738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.027752 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.027774 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.027786 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.043849 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 17:16:16.586843725 +0000 UTC Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.074166 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.074232 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.074177 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:49 crc kubenswrapper[4870]: E0130 08:09:49.074357 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:49 crc kubenswrapper[4870]: E0130 08:09:49.074521 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:49 crc kubenswrapper[4870]: E0130 08:09:49.074726 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.130426 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.130462 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.130471 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.130487 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.130496 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.233269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.233339 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.233352 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.233376 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.233391 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.243593 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.243667 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.249563 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f" exitCode=0 Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.249647 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.249719 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"cb12165531731a176212e5ceb871fbc54aec2538e3ad27d93d5c0438cf177aa7"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.251828 4870 generic.go:334] "Generic (PLEG): container finished" podID="8bdd7f5e-1187-4760-b2dc-98c3d3286f05" containerID="0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7" exitCode=0 Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.253149 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerDied","Data":"0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.264675 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.294706 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.319038 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.334794 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.336147 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.336197 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.336208 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.336224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.336236 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.347354 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.358195 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.372155 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.387998 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.409636 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.434768 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.439598 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.439662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.439675 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.439694 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.439704 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.451315 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.465923 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 
08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.478319 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.492328 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.505305 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.516023 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8
a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.537236 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.541767 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.541811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.541829 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.541846 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.541855 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.550170 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.563013 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.573313 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.589898 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.617843 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.631041 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.644773 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.645128 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.645198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.645274 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.645341 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.646376 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.660657 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.671245 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.683141 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.693074 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:49Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.747744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.747956 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.748041 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.748113 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.748179 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.852438 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.852482 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.852494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.852517 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.852529 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.958182 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.958229 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.958240 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.958294 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:49 crc kubenswrapper[4870]: I0130 08:09:49.958306 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:49Z","lastTransitionTime":"2026-01-30T08:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.044924 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 15:07:29.764998287 +0000 UTC Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.061263 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.061302 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.061314 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.061330 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.061343 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.163963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.164000 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.164009 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.164025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.164034 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.257262 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.257304 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.257325 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.257336 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.257348 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.260116 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerStarted","Data":"d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f"} Jan 30 08:09:50 crc 
kubenswrapper[4870]: I0130 08:09:50.266379 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.266518 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.266589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.266667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.266740 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.285049 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.299935 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.313265 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.326416 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.340919 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.357510 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.369589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.369645 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.369656 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.369673 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.369694 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.374637 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.396408 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dpj7j"] Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.396769 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.398713 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.399125 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.399137 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.399148 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.407370 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.421086 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.435401 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 
08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.448497 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.463426 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.472018 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.472047 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.472058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.472073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.472092 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.481148 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.494663 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.512110 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.527398 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.540970 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.555704 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.567202 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.574609 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.574647 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.574655 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.574670 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.574680 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.575583 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/228f8bf9-7e75-4886-8441-57bc0d251413-host\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.575826 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/228f8bf9-7e75-4886-8441-57bc0d251413-serviceca\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.576051 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9dtx\" (UniqueName: \"kubernetes.io/projected/228f8bf9-7e75-4886-8441-57bc0d251413-kube-api-access-x9dtx\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.579501 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.594728 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.606990 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.624589 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.638649 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.654328 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.676910 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/228f8bf9-7e75-4886-8441-57bc0d251413-host\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.676960 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/228f8bf9-7e75-4886-8441-57bc0d251413-serviceca\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.677004 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9dtx\" (UniqueName: \"kubernetes.io/projected/228f8bf9-7e75-4886-8441-57bc0d251413-kube-api-access-x9dtx\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " 
pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.677004 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/228f8bf9-7e75-4886-8441-57bc0d251413-host\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.677050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.677081 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.677098 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.677118 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.677136 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.678736 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/228f8bf9-7e75-4886-8441-57bc0d251413-serviceca\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.691507 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.719988 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9dtx\" (UniqueName: \"kubernetes.io/projected/228f8bf9-7e75-4886-8441-57bc0d251413-kube-api-access-x9dtx\") pod \"node-ca-dpj7j\" (UID: \"228f8bf9-7e75-4886-8441-57bc0d251413\") " pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.754760 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.780713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc 
kubenswrapper[4870]: I0130 08:09:50.780749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.780761 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.780776 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.780788 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.790401 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.828822 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:50Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.882817 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.882865 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.882921 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 
08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.882938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.882951 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.986811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.986868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.986913 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.986940 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:50 crc kubenswrapper[4870]: I0130 08:09:50.986958 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:50Z","lastTransitionTime":"2026-01-30T08:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.010033 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dpj7j" Jan 30 08:09:51 crc kubenswrapper[4870]: W0130 08:09:51.029540 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228f8bf9_7e75_4886_8441_57bc0d251413.slice/crio-59a39a83ea2a993841daf6a190fb3bb2d1d7c07fc4b1de61267f753f3336c0ad WatchSource:0}: Error finding container 59a39a83ea2a993841daf6a190fb3bb2d1d7c07fc4b1de61267f753f3336c0ad: Status 404 returned error can't find the container with id 59a39a83ea2a993841daf6a190fb3bb2d1d7c07fc4b1de61267f753f3336c0ad Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.045377 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 04:35:27.32303976 +0000 UTC Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.074024 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.074108 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:51 crc kubenswrapper[4870]: E0130 08:09:51.074155 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.074185 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:51 crc kubenswrapper[4870]: E0130 08:09:51.074313 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:51 crc kubenswrapper[4870]: E0130 08:09:51.074408 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.090796 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.090840 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.090851 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.090866 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.090905 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.193799 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.193831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.193844 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.193863 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.193883 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.264350 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dpj7j" event={"ID":"228f8bf9-7e75-4886-8441-57bc0d251413","Type":"ContainerStarted","Data":"59a39a83ea2a993841daf6a190fb3bb2d1d7c07fc4b1de61267f753f3336c0ad"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.268315 4870 generic.go:334] "Generic (PLEG): container finished" podID="8bdd7f5e-1187-4760-b2dc-98c3d3286f05" containerID="d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f" exitCode=0 Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.268383 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerDied","Data":"d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.275084 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.287181 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.295662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.295696 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.295707 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.295723 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.295734 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.300872 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.315902 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.329857 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.344408 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.356421 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.383080 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.396178 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.398064 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.398090 
4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.398099 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.398132 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.398160 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.409790 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.422710 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.434770 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.448779 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.459049 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.484453 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.497562 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.500786 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc 
kubenswrapper[4870]: I0130 08:09:51.500824 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.500831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.500847 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.500857 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.604590 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.604633 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.604643 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.604660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.604672 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.707845 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.707907 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.707918 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.707936 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.707947 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.811378 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.811439 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.811453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.811479 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.811497 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.841745 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.849477 4870 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.854133 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/pods/multus-additional-cni-plugins-rrkfz/status\": read tcp 38.129.56.227:58122->38.129.56.227:6443: use of closed network connection" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.894399 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.907405 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.916285 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.916327 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.916338 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.916352 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.916363 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:51Z","lastTransitionTime":"2026-01-30T08:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.921158 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.935105 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.946243 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.958034 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.970672 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:51 crc kubenswrapper[4870]: I0130 08:09:51.989632 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.000513 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:51Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.011989 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.018733 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.018760 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.018769 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 
08:09:52.018781 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.018791 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.021760 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.032684 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.041618 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8
a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.046389 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:17:20.036425753 +0000 UTC Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.049488 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.088263 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.121446 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.121506 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.121525 4870 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.121549 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.121569 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.127339 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.155135 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.195764 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.224088 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.224122 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.224131 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.224144 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.224155 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.239556 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.272436 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.282195 4870 generic.go:334] "Generic (PLEG): container finished" podID="8bdd7f5e-1187-4760-b2dc-98c3d3286f05" containerID="1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63" exitCode=0 Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.282259 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerDied","Data":"1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.285197 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dpj7j" event={"ID":"228f8bf9-7e75-4886-8441-57bc0d251413","Type":"ContainerStarted","Data":"872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.312239 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.326511 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.326705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.326791 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.326859 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.326945 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.353913 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.436793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.436837 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.436850 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.436868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.436900 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.442767 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z 
is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.454170 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.472259 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.516486 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.539407 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.539442 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.539450 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.539464 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.539473 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.551351 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.592248 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.628761 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.641525 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.641564 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.641574 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.641589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.641600 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.672325 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.715498 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.744414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.744468 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc 
kubenswrapper[4870]: I0130 08:09:52.744483 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.744501 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.744513 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.750529 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.793710 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.832290 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.846413 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.846461 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.846477 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.846501 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.846516 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.870128 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.919621 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233
caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.949674 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.949731 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.949745 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.949767 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.949782 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:52Z","lastTransitionTime":"2026-01-30T08:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.956185 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3c
f5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:52 crc kubenswrapper[4870]: I0130 08:09:52.997943 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:52Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.032668 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 
08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.046548 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 13:30:11.475967264 +0000 UTC Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.052551 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.052962 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.052985 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.053009 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.053027 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.074279 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:53 crc kubenswrapper[4870]: E0130 08:09:53.074409 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.074724 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:53 crc kubenswrapper[4870]: E0130 08:09:53.074787 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.074826 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:53 crc kubenswrapper[4870]: E0130 08:09:53.074864 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.078948 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.112449 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.153649 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.155116 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.155160 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.155174 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.155191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.155204 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.195998 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.231102 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.257753 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.257809 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.257841 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.257869 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.257899 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.293675 4870 generic.go:334] "Generic (PLEG): container finished" podID="8bdd7f5e-1187-4760-b2dc-98c3d3286f05" containerID="199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e" exitCode=0 Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.293741 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerDied","Data":"199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.314170 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.318186 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.341427 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.357161 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.362654 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.362680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.362688 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.362701 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.362709 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.393216 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.431862 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.465832 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.465859 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.465866 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.465892 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.465901 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.470982 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.518141 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.550654 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.568538 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.568565 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.568575 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.568603 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.568612 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.602645 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.635145 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.675702 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.675738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.675746 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.675759 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.675769 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.687343 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.710836 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.750091 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.778793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.778834 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.778865 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.779114 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.779144 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.791070 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.828779 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:53Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.881783 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.881841 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.881856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.881892 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.881905 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.984635 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.984692 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.984707 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.984723 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:53 crc kubenswrapper[4870]: I0130 08:09:53.984735 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:53Z","lastTransitionTime":"2026-01-30T08:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.046815 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 21:35:59.965153137 +0000 UTC Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.086696 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.086732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.086743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.086756 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.086770 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.189000 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.189030 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.189038 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.189050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.189060 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.292241 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.292295 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.292311 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.292329 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.292340 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.319374 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerStarted","Data":"05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.337081 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c7
2585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.357680 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.371051 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.385263 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.395252 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.395297 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.395339 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.395360 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.395373 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.397750 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.408120 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.420535 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.431467 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.449519 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.464525 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.476049 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.485441 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.497912 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.498373 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.498423 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.498439 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.498457 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.498472 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.512450 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jws
vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.527615 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:54Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.601335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.601596 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.601605 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.601620 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.601629 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.704301 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.704434 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.704499 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.704577 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.705682 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.809103 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.809168 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.809191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.809223 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.809263 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.910875 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.910938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.910952 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.910970 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:54 crc kubenswrapper[4870]: I0130 08:09:54.910981 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:54Z","lastTransitionTime":"2026-01-30T08:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.013079 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.013130 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.013142 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.013159 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.013171 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.047752 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 22:11:35.934798055 +0000 UTC Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.073572 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.073623 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.073614 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:55 crc kubenswrapper[4870]: E0130 08:09:55.073735 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:55 crc kubenswrapper[4870]: E0130 08:09:55.073997 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:55 crc kubenswrapper[4870]: E0130 08:09:55.074165 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.116135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.116175 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.116186 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.116202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.116213 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.219138 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.219212 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.219230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.219267 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.219288 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.322588 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.322662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.322685 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.322715 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.322738 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.336415 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.336798 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.343688 4870 generic.go:334] "Generic (PLEG): container finished" podID="8bdd7f5e-1187-4760-b2dc-98c3d3286f05" containerID="05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f" exitCode=0 Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.343753 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerDied","Data":"05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.360358 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.377076 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.390333 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 
08:09:55.391207 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.404449 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.420592 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.426070 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.426102 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.426114 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.426131 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.426142 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.433036 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.443593 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.465598 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.482709 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jws
vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.494635 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of 
http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.507723 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.520200 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.529379 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.529536 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.529617 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.529701 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.529787 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.532352 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.542821 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.552658 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.564586 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.581545 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.592196 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T0
8:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.604270 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.614961 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.624695 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.631586 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.631611 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.631620 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.631632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.631643 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.636552 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.677340 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.697214 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.707627 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.717535 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.733968 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.734016 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.734026 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.734044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.734055 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.738353 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.749619 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.761592 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.778106 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:55Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.836298 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.836363 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.836376 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.836413 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.836425 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.939509 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.939585 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.939625 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.939657 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:55 crc kubenswrapper[4870]: I0130 08:09:55.939678 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:55Z","lastTransitionTime":"2026-01-30T08:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.042728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.042788 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.042800 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.042835 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.042848 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.047922 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 04:58:52.429483404 +0000 UTC Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.145748 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.145836 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.145854 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.145906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.145928 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.248967 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.249052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.249071 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.249095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.249139 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.350912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.350964 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.351012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.351033 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.351048 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.352928 4870 generic.go:334] "Generic (PLEG): container finished" podID="8bdd7f5e-1187-4760-b2dc-98c3d3286f05" containerID="7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1" exitCode=0 Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.353121 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.353274 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerDied","Data":"7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.358266 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.374605 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.392132 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.394053 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.403520 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.427702 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.440410 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.453717 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.453789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.453821 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.453834 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.453852 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.453868 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.465183 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.479061 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f98
2e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.491105 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.507729 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.527667 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.543301 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T0
8:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.556077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.556113 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.556135 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.556048 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7
b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.556155 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.556254 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.568368 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.581182 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.593476 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.604811 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.615521 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.623929 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.639651 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.649841 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.659138 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.659178 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.659189 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.659205 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.659217 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.659734 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.675857 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233
caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.689080 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.703046 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3c
f5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.720514 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.736026 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-3
0T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.753128 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.762222 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.762264 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.762278 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.762298 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.762312 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.767215 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.767347 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:10:12.767323 +0000 UTC m=+51.462870129 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.791973 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\
"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 
08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.833589 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:56Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.865404 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.865443 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.865455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.865472 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.865485 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.868083 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.868124 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.868160 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.868182 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868201 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868268 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:12.868248347 +0000 UTC m=+51.563795466 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868289 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868308 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868319 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 
08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868361 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:12.86834658 +0000 UTC m=+51.563893699 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868358 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868401 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868415 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868493 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:12.868472294 +0000 UTC m=+51.564019443 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868422 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:56 crc kubenswrapper[4870]: E0130 08:09:56.868566 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:12.868550846 +0000 UTC m=+51.564098005 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.968033 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.968078 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.968094 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.968110 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:56 crc kubenswrapper[4870]: I0130 08:09:56.968123 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:56Z","lastTransitionTime":"2026-01-30T08:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.048792 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 19:57:52.909350987 +0000 UTC Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.072667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.072739 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.072764 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.072794 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.072817 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.074006 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.074193 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.074806 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.074994 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.075116 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.075253 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.175473 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.175509 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.175520 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.175541 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.175550 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.277186 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.277220 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.277231 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.277244 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.277252 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.359551 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" event={"ID":"8bdd7f5e-1187-4760-b2dc-98c3d3286f05","Type":"ContainerStarted","Data":"b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.359653 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.373370 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.379516 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.379562 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.379577 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.379593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.379603 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.390553 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.418976 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.434243 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.448018 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.457084 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.470595 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.481454 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.481494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.481505 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.481519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.481529 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.482082 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.490257 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.506675 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.519282 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05
a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d280
7acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T
08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.532498 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.544202 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.547414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.547457 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.547469 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.547486 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.547497 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.557353 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.559655 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.562856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.562924 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.562938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.562957 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.562970 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.568982 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.574289 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.578023 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.578058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.578066 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.578080 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.578089 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.590135 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.593652 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.593672 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.593681 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.593695 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.593705 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.606374 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.609270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.609308 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.609316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.609354 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.609364 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.631784 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:57Z is after 2025-08-24T17:21:41Z"
Jan 30 08:09:57 crc kubenswrapper[4870]: E0130 08:09:57.631921 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.633614 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.633640 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.633649 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.633662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.633671 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.736619 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.736686 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.736696 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.736710 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.736718 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.838648 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.838674 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.838682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.838695 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.838705 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.940871 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.940982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.941012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.941040 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:09:57 crc kubenswrapper[4870]: I0130 08:09:57.941058 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:57Z","lastTransitionTime":"2026-01-30T08:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.043528 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.043563 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.043572 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.043586 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.043595 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.049034 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:31:50.827947012 +0000 UTC Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.146571 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.146632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.146650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.146673 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.146690 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.249985 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.250056 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.250075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.250098 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.250115 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.353743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.353823 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.353841 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.353868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.353953 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.366949 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/0.log" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.371446 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0" exitCode=1 Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.371678 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.372459 4870 scope.go:117] "RemoveContainer" containerID="4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.397292 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f8
58676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.433439 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"d *v1.Pod event handler 3\\\\nI0130 08:09:57.802952 6155 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803083 6155 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 08:09:57.803225 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803438 6155 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 08:09:57.803479 6155 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 08:09:57.803490 6155 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 08:09:57.803503 6155 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:57.803541 6155 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 08:09:57.803558 6155 factory.go:656] Stopping watch factory\\\\nI0130 08:09:57.803557 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 08:09:57.803574 6155 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:57.803568 6155 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf043116844975
72745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.451979 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.457176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.457234 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.457253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.457280 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.457296 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.467461 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.484333 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.541342 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.561596 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.561852 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.561938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.561952 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.561967 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.561979 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.574599 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.590201 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.621197 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.637259 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.652684 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.664245 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.664278 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.664286 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.664299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.664308 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.667380 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.681462 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f98
2e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.695836 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:58Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.767249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.767305 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.767317 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc 
kubenswrapper[4870]: I0130 08:09:58.767337 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.767350 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.869705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.869757 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.869770 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.869789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.869801 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.972387 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.972420 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.972430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.972443 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:58 crc kubenswrapper[4870]: I0130 08:09:58.972451 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:58Z","lastTransitionTime":"2026-01-30T08:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.049739 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 22:56:24.089448273 +0000 UTC Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.073613 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.073613 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:09:59 crc kubenswrapper[4870]: E0130 08:09:59.073726 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:09:59 crc kubenswrapper[4870]: E0130 08:09:59.073831 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.073615 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:09:59 crc kubenswrapper[4870]: E0130 08:09:59.074044 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.075038 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.075076 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.075087 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.075103 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.075116 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.177119 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.177190 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.177203 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.177219 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.177231 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.279079 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.279113 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.279126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.279147 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.279158 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.377958 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/0.log" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.381489 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.381700 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.382253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.382292 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.382308 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.382325 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.382338 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.401851 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.417290 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.431752 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.444614 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.459023 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.470071 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.484838 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.484916 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.484932 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.484966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.484984 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.492221 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.506825 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f8
58676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.526345 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"d *v1.Pod event handler 3\\\\nI0130 08:09:57.802952 6155 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803083 6155 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0130 08:09:57.803225 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803438 6155 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 08:09:57.803479 6155 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 08:09:57.803490 6155 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 08:09:57.803503 6155 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:57.803541 6155 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 08:09:57.803558 6155 factory.go:656] Stopping watch factory\\\\nI0130 08:09:57.803557 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 08:09:57.803574 6155 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:57.803568 6155 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.538038 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.549549 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.561333 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.576009 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.587527 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.587584 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.587602 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.587625 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.587642 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.590474 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.601083 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.681350 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq"] Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.682178 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.685476 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.686046 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.689529 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.689564 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.689577 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.689595 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.689609 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.694170 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.694238 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.694444 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fglbm\" (UniqueName: \"kubernetes.io/projected/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-kube-api-access-fglbm\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.694536 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.698025 4870 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.709197 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.721547 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.742206 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.756099 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.772055 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.783835 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.792764 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.792832 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.792856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.792922 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.792955 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.795298 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.795357 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.795386 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.795457 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fglbm\" (UniqueName: \"kubernetes.io/projected/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-kube-api-access-fglbm\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.796026 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.796495 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.798728 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.805127 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.812849 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fglbm\" (UniqueName: \"kubernetes.io/projected/b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8-kube-api-access-fglbm\") pod \"ovnkube-control-plane-749d76644c-q5xdq\" (UID: \"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.824759 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268a
e9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.840366 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.857916 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.885299 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"d *v1.Pod event handler 3\\\\nI0130 08:09:57.802952 6155 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803083 6155 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0130 08:09:57.803225 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803438 6155 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 08:09:57.803479 6155 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 08:09:57.803490 6155 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 08:09:57.803503 6155 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:57.803541 6155 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 08:09:57.803558 6155 factory.go:656] Stopping watch factory\\\\nI0130 08:09:57.803557 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 08:09:57.803574 6155 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:57.803568 6155 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.896123 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.896175 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.896195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.896219 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.896237 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.900968 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.921207 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.940607 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.963329 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:09:59Z is after 2025-08-24T17:21:41Z" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.995091 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.998448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.998633 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.998754 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.998880 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:09:59 crc kubenswrapper[4870]: I0130 08:09:59.999021 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:09:59Z","lastTransitionTime":"2026-01-30T08:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: W0130 08:10:00.014727 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb66c5f2c_1e0e_4d09_ab12_8cd255f29aa8.slice/crio-6666711ec23bba216eebd7fab9fe80d59045c0400d3fd8b44279e456d40e7f83 WatchSource:0}: Error finding container 6666711ec23bba216eebd7fab9fe80d59045c0400d3fd8b44279e456d40e7f83: Status 404 returned error can't find the container with id 6666711ec23bba216eebd7fab9fe80d59045c0400d3fd8b44279e456d40e7f83 Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.050134 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:11:27.685869281 +0000 UTC Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.101431 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.101477 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.101494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.101526 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.101549 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.204138 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.204196 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.204218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.204247 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.204270 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.306839 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.307115 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.307203 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.307287 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.307366 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.388556 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/1.log" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.389422 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/0.log" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.392525 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483" exitCode=1 Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.392624 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.392701 4870 scope.go:117] "RemoveContainer" containerID="4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.393930 4870 scope.go:117] "RemoveContainer" containerID="bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483" Jan 30 08:10:00 crc kubenswrapper[4870]: E0130 08:10:00.394227 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.397921 4870 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" event={"ID":"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8","Type":"ContainerStarted","Data":"7a87e50cd50d8225874491d677cc6961215e485310948984353dda93ab97eced"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.397970 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" event={"ID":"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8","Type":"ContainerStarted","Data":"ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.397986 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" event={"ID":"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8","Type":"ContainerStarted","Data":"6666711ec23bba216eebd7fab9fe80d59045c0400d3fd8b44279e456d40e7f83"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.408247 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.409196 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.409232 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.409242 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.409256 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.409265 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.424801 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.440398 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.459793 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.476409 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.493068 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.512137 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.512176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.512187 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.512205 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.512218 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.512205 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.522610 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.535333 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri
-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.555452 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"d *v1.Pod event handler 3\\\\nI0130 08:09:57.802952 6155 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803083 6155 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0130 08:09:57.803225 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803438 6155 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 08:09:57.803479 6155 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 08:09:57.803490 6155 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 08:09:57.803503 6155 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:57.803541 6155 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 08:09:57.803558 6155 factory.go:656] Stopping watch factory\\\\nI0130 08:09:57.803557 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 08:09:57.803574 6155 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:57.803568 6155 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin
-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"n
ame\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc 
kubenswrapper[4870]: I0130 08:10:00.567616 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.581778 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.594169 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.610413 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.614111 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.614146 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.614157 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.614173 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.614184 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.621803 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.634343 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.645997 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.657165 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.668088 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.679814 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.689663 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.706858 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.716305 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.716368 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.716390 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.716415 4870 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.716432 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.718359 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee
88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources
\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.735646 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.750420 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.766015 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b16
5562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.785729 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e324fc6bf6711c4736623474f414f0cbb0b17c3c705fd969c8183bb670739c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"message\\\":\\\"d *v1.Pod event handler 3\\\\nI0130 08:09:57.802952 6155 reflector.go:311] Stopping 
reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803083 6155 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 08:09:57.803225 6155 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 08:09:57.803438 6155 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 08:09:57.803479 6155 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 08:09:57.803490 6155 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 08:09:57.803503 6155 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:57.803541 6155 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 08:09:57.803558 6155 factory.go:656] Stopping watch factory\\\\nI0130 08:09:57.803557 6155 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 08:09:57.803574 6155 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:57.803568 6155 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service 
community-operators on namespace openshift-marketplace for network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf0431168449
7572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.795945 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.808048 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.818970 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.818948 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.819002 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.819012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.819025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.819034 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.830236 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.839838 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:00Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.921432 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.921466 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.921476 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.921492 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:00 crc kubenswrapper[4870]: I0130 08:10:00.921504 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:00Z","lastTransitionTime":"2026-01-30T08:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.024205 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.024239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.024251 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.024266 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.024278 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.051224 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 08:16:28.878197588 +0000 UTC Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.074573 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.074641 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:01 crc kubenswrapper[4870]: E0130 08:10:01.074770 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.074797 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:01 crc kubenswrapper[4870]: E0130 08:10:01.074977 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:01 crc kubenswrapper[4870]: E0130 08:10:01.075062 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.127236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.127284 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.127297 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.127316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.127329 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.229318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.229384 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.229394 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.229406 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.229415 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.333095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.333133 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.333142 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.333154 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.333164 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.405295 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/1.log" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.411719 4870 scope.go:117] "RemoveContainer" containerID="bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483" Jan 30 08:10:01 crc kubenswrapper[4870]: E0130 08:10:01.412075 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.436843 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.436920 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.436943 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.436962 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.436975 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.437771 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j
wsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-b
incopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.467506 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for 
network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.482336 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.503812 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.516561 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-mp9vw"] Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.517494 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:01 crc kubenswrapper[4870]: E0130 08:10:01.517626 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.524032 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f8
25d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.538329 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.539152 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.539190 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.539202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.539218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.539231 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.553916 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.570207 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.585467 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.614122 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx29n\" (UniqueName: \"kubernetes.io/projected/7b976744-b72d-4291-a32f-437fc1cfbf03-kube-api-access-rx29n\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.614206 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.624690 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.642128 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.642210 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.642233 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.642259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.642277 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.644474 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.662081 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.676270 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.693585 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.706744 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.714651 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.714724 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx29n\" (UniqueName: \"kubernetes.io/projected/7b976744-b72d-4291-a32f-437fc1cfbf03-kube-api-access-rx29n\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:01 crc kubenswrapper[4870]: E0130 08:10:01.714774 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:01 crc kubenswrapper[4870]: E0130 08:10:01.714830 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:02.214814545 +0000 UTC m=+40.910361664 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.721816 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.732579 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx29n\" (UniqueName: \"kubernetes.io/projected/7b976744-b72d-4291-a32f-437fc1cfbf03-kube-api-access-rx29n\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.741665 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.744447 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.744485 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.744496 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.744510 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.744518 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.754413 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.769315 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.782577 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.797841 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.812559 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.837529 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.846306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.846343 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.846354 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.846370 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.846382 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.856593 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.867996 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.877048 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.887502 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.898801 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.907531 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.922973 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.933743 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.943346 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:01 crc 
kubenswrapper[4870]: I0130 08:10:01.948331 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.948361 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.948374 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.948391 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.948402 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:01Z","lastTransitionTime":"2026-01-30T08:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:01 crc kubenswrapper[4870]: I0130 08:10:01.956997 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:01Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.050437 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.050482 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.050499 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.050515 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.050526 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.051580 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 20:21:46.2298584 +0000 UTC Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.092401 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for 
network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.104735 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.115435 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc 
kubenswrapper[4870]: I0130 08:10:02.128681 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1b
c907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.152557 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.153603 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.153650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.153663 4870 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.153679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.153695 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.194481 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" 
Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.217740 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:02 crc kubenswrapper[4870]: E0130 08:10:02.217855 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:02 crc kubenswrapper[4870]: E0130 08:10:02.218049 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:03.217971274 +0000 UTC m=+41.913518383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.230277 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.256360 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.256619 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.256704 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.256793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.256894 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.272365 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.311632 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.352036 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.358963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.359094 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.359173 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.359280 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.359359 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.396383 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.436844 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.461823 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.461874 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.461914 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.461932 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.461944 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.471423 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.513608 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.551042 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.564143 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.564177 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.564188 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.564203 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.564214 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.590403 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.631755 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:02Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.667327 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.667457 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.667495 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.667540 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.667567 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.771130 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.771190 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.771211 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.771241 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.771265 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.874484 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.874562 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.874584 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.874608 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.874624 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.977916 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.977978 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.978005 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.978030 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:02 crc kubenswrapper[4870]: I0130 08:10:02.978049 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:02Z","lastTransitionTime":"2026-01-30T08:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.052251 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 01:42:55.691324766 +0000 UTC Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.073802 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.073831 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:03 crc kubenswrapper[4870]: E0130 08:10:03.073979 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.074105 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.074111 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:03 crc kubenswrapper[4870]: E0130 08:10:03.074296 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:03 crc kubenswrapper[4870]: E0130 08:10:03.074456 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:03 crc kubenswrapper[4870]: E0130 08:10:03.074599 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.081630 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.081715 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.081741 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.081774 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.081798 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.184690 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.184750 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.184768 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.184792 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.184810 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.229758 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:03 crc kubenswrapper[4870]: E0130 08:10:03.229992 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:03 crc kubenswrapper[4870]: E0130 08:10:03.230116 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:05.230083695 +0000 UTC m=+43.925630844 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.287455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.287490 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.287501 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.287516 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.287528 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.390388 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.390439 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.390455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.390475 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.390489 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.492993 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.493059 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.493077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.493100 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.493124 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.596459 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.596501 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.596511 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.596526 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.596534 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.699036 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.699084 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.699108 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.699126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.699138 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.803123 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.803172 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.803186 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.803207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.803219 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.908815 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.909092 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.909565 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.909589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:03 crc kubenswrapper[4870]: I0130 08:10:03.909604 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:03Z","lastTransitionTime":"2026-01-30T08:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.013022 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.013072 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.013081 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.013097 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.013107 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.053282 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 08:08:53.594340843 +0000 UTC Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.116030 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.116065 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.116074 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.116088 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.116098 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.218955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.219047 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.219062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.219087 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.219105 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.323660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.323738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.323752 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.323780 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.323793 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.426957 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.427044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.427071 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.427110 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.427136 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.530584 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.530641 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.530654 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.530677 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.530692 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.633435 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.633488 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.633505 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.633529 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.633563 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.737421 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.737491 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.737512 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.737541 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.737561 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.841020 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.841073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.841087 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.841104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.841116 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.944193 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.944271 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.944297 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.944329 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:04 crc kubenswrapper[4870]: I0130 08:10:04.944371 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:04Z","lastTransitionTime":"2026-01-30T08:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.048003 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.048052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.048073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.048093 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.048107 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.053768 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 09:31:10.29792682 +0000 UTC Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.074518 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.074571 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.074550 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.074616 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:05 crc kubenswrapper[4870]: E0130 08:10:05.074685 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:05 crc kubenswrapper[4870]: E0130 08:10:05.074945 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:05 crc kubenswrapper[4870]: E0130 08:10:05.075154 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:05 crc kubenswrapper[4870]: E0130 08:10:05.075187 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.151324 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.151359 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.151368 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.151382 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.151391 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.253719 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:05 crc kubenswrapper[4870]: E0130 08:10:05.253838 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:05 crc kubenswrapper[4870]: E0130 08:10:05.253903 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:09.253888165 +0000 UTC m=+47.949435274 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.254979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.255058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.255078 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.255104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.255124 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.357841 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.357904 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.357918 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.357935 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.357949 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.460526 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.460573 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.460592 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.460640 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.460659 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.563566 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.563602 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.563611 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.563624 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.563634 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.667016 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.667311 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.667350 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.667381 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.667403 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.770397 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.770497 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.770523 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.770557 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.770582 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.873387 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.873422 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.873433 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.873449 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.873460 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.977044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.977135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.977152 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.977175 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:05 crc kubenswrapper[4870]: I0130 08:10:05.977194 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:05Z","lastTransitionTime":"2026-01-30T08:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.006251 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.007661 4870 scope.go:117] "RemoveContainer" containerID="bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483" Jan 30 08:10:06 crc kubenswrapper[4870]: E0130 08:10:06.007980 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.054171 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 09:20:48.224421431 +0000 UTC Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.079244 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.079310 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.079328 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.079347 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.079360 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.183588 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.183669 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.183688 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.183715 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.183734 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.287175 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.287244 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.287262 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.287285 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.287302 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.390588 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.390647 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.390663 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.390689 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.390706 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.493948 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.494038 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.494068 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.494102 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.494127 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.596974 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.597014 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.597025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.597041 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.597054 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.700191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.700236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.700248 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.700298 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.700313 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.803916 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.803971 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.803982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.804002 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.804016 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.907413 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.907474 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.907486 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.907508 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:06 crc kubenswrapper[4870]: I0130 08:10:06.907519 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:06Z","lastTransitionTime":"2026-01-30T08:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.010519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.010580 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.010597 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.010624 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.010643 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.055032 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:20:10.039203289 +0000 UTC Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.073933 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.073960 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.073933 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.074034 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.074174 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.074338 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.074552 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.074612 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.113958 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.114001 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.114011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.114024 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.114033 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.217646 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.217724 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.217744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.217770 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.217789 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.321425 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.321479 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.321494 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.321515 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.321529 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.424221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.424308 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.424332 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.424353 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.424370 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.527556 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.527620 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.527641 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.527668 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.527689 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.631039 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.631112 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.631134 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.631162 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.631184 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.727457 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.727531 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.727561 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.727589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.727609 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.747643 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:07Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.754164 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.754198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.754211 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.754228 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.754239 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.773119 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:07Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.779035 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.779077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.779088 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.779128 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.779142 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.795240 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:07Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.800262 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.800320 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.800344 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.800377 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.800399 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.822656 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:07Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.827975 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.828020 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.828037 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.828058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.828070 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.848257 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:07Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:07 crc kubenswrapper[4870]: E0130 08:10:07.848399 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.850803 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.850857 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.850920 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.850955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.850980 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.954400 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.954496 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.954514 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.954540 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:07 crc kubenswrapper[4870]: I0130 08:10:07.954561 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:07Z","lastTransitionTime":"2026-01-30T08:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.055669 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 07:36:59.685769235 +0000 UTC Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.058091 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.058158 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.058179 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.058204 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.058222 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.161710 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.161766 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.161783 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.161806 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.161824 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.264856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.264979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.264999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.265026 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.265045 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.368220 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.368283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.368298 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.368312 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.368320 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.471977 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.472431 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.472521 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.472615 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.472740 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.575215 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.575262 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.575272 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.575290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.575300 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.678228 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.678291 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.678308 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.678332 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.678350 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.781836 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.781907 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.781922 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.781940 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.781952 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.884978 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.885015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.885025 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.885040 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.885052 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.987433 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.987465 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.987475 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.987487 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:08 crc kubenswrapper[4870]: I0130 08:10:08.987495 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:08Z","lastTransitionTime":"2026-01-30T08:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.056563 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 06:37:09.530940519 +0000 UTC Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.074511 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.074552 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.074660 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:09 crc kubenswrapper[4870]: E0130 08:10:09.074748 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.074779 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:09 crc kubenswrapper[4870]: E0130 08:10:09.074912 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:09 crc kubenswrapper[4870]: E0130 08:10:09.075239 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:09 crc kubenswrapper[4870]: E0130 08:10:09.075162 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.090380 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.090675 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.090820 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.091015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.091162 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.193684 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.193738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.193751 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.193768 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.193780 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.296690 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.296728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.296743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.296763 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.296778 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.297608 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:09 crc kubenswrapper[4870]: E0130 08:10:09.297716 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 30 08:10:09 crc kubenswrapper[4870]: E0130 08:10:09.297778 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:17.297760287 +0000 UTC m=+55.993307406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.399492 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.399535 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.399545 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.399562 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.399575 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.502204 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.502326 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.502345 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.502373 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.502391 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.605213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.605283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.605306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.605335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.605357 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.708945 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.709012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.709037 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.709069 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.709091 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.812359 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.812434 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.812455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.812498 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.812522 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.915378 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.915453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.915474 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.915504 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:09 crc kubenswrapper[4870]: I0130 08:10:09.915527 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:09Z","lastTransitionTime":"2026-01-30T08:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.018755 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.018978 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.019009 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.019042 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.019059 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.057136 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 05:40:23.881853209 +0000 UTC Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.121778 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.121840 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.121859 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.121919 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.121963 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.225230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.225269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.225280 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.225295 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.225307 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.328982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.329062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.329104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.329136 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.329156 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.431670 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.431729 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.431751 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.431779 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.431799 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.534318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.534379 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.534396 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.534417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.534435 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.637398 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.637428 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.637438 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.637453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.637462 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.739912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.739952 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.739961 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.739974 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.739984 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.844068 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.844165 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.844183 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.844208 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.844227 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.947459 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.947563 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.947581 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.947604 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:10 crc kubenswrapper[4870]: I0130 08:10:10.947621 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:10Z","lastTransitionTime":"2026-01-30T08:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.050717 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.050848 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.050939 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.050979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.051002 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.057928 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 09:08:59.431165673 +0000 UTC
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.073795 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.073978 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:11 crc kubenswrapper[4870]: E0130 08:10:11.074118 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.074183 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.074264 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:11 crc kubenswrapper[4870]: E0130 08:10:11.074428 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:11 crc kubenswrapper[4870]: E0130 08:10:11.074570 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:11 crc kubenswrapper[4870]: E0130 08:10:11.074735 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.155074 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.155132 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.155144 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.155163 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.155177 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.258723 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.258776 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.258784 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.258799 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.258808 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.362806 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.362868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.362935 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.362960 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.362976 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.466390 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.466428 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.466437 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.466451 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.466465 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.569391 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.569464 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.569487 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.569515 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.569535 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.673230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.673287 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.673304 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.673328 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.673344 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.775999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.776050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.776066 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.776087 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.776106 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.879700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.879781 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.879802 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.879826 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.879842 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.983283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.983546 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.983610 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.983700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:11 crc kubenswrapper[4870]: I0130 08:10:11.983785 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:11Z","lastTransitionTime":"2026-01-30T08:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.058732 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 15:04:54.660144608 +0000 UTC Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.086852 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.087019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.087046 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.087069 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.087131 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.094615 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.122556 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for 
network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.139076 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.156534 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc 
kubenswrapper[4870]: I0130 08:10:12.180821 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146eda
f4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 
08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.190316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.190384 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.190402 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.190429 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.190447 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.198441 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.220482 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.243194 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.260207 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.278312 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.293645 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.293692 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.293702 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.293719 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.293728 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.295354 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.310336 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.342790 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.364281 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.380195 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.393029 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.396102 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.396244 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.396261 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.396279 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.396290 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.409143 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:12Z 
is after 2025-08-24T17:21:41Z" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.499200 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.499235 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.499246 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.499261 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.499274 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.601629 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.601700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.601724 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.601755 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.601777 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.704337 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.704403 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.704419 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.704442 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.704463 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.807573 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.807649 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.807675 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.807705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.807728 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.841227 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.841389 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 08:10:44.841359106 +0000 UTC m=+83.536906255 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.910631 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.910673 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.910684 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.910699 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.910710 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:12Z","lastTransitionTime":"2026-01-30T08:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.942527 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.942620 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.942717 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.942810 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.942853 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.942913 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:12 crc 
kubenswrapper[4870]: E0130 08:10:12.942832 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:44.942803615 +0000 UTC m=+83.638350764 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.942737 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.943020 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:44.942990701 +0000 UTC m=+83.638537850 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:12 crc kubenswrapper[4870]: I0130 08:10:12.943058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.943085 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.943179 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.943197 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.943210 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 
08:10:12.943273 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:44.943254159 +0000 UTC m=+83.638801268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:12 crc kubenswrapper[4870]: E0130 08:10:12.943312 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:44.94328393 +0000 UTC m=+83.638831099 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.013654 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.013713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.013731 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.013763 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.013781 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.060351 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:30:08.99239544 +0000 UTC Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.074287 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.074344 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.074395 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.074325 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:13 crc kubenswrapper[4870]: E0130 08:10:13.074478 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:13 crc kubenswrapper[4870]: E0130 08:10:13.074608 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:13 crc kubenswrapper[4870]: E0130 08:10:13.074730 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:13 crc kubenswrapper[4870]: E0130 08:10:13.074944 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.116687 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.116760 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.116778 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.116805 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.116823 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.220414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.220481 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.220499 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.220522 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.220542 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.323725 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.323781 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.323801 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.323830 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.323850 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.426942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.427019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.427046 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.427119 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.427144 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.529663 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.529724 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.529746 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.529773 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.529795 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.633413 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.633471 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.633489 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.633514 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.633533 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.724864 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.739403 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.739660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.739710 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.739724 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.739746 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.739759 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.747815 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.771776 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.792445 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.810196 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.825591 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.843412 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.845223 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.845264 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.845281 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.845304 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.845321 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.870376 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.888581 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.902933 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.913290 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.926517 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.940799 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.948855 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.948926 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.948939 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.948960 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.948973 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:13Z","lastTransitionTime":"2026-01-30T08:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.952158 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.982576 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace 
openshift-marketplace for network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:13 crc kubenswrapper[4870]: I0130 08:10:13.998031 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:13Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.014337 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:14Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:14 crc 
kubenswrapper[4870]: I0130 08:10:14.037337 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1b
c907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:14Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.051401 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.051460 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.051483 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.051514 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.051538 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.060928 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 12:06:10.605915443 +0000 UTC Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.154819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.154868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.154899 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.154920 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.154931 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.258686 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.258744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.258760 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.258784 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.258803 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.361660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.361709 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.361728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.361755 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.361773 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.464109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.464165 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.464182 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.464205 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.464222 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.566697 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.566767 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.566789 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.566819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.566841 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.669434 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.669477 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.669487 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.669503 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.669516 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.771951 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.772011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.772029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.772056 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.772074 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.874953 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.874999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.875014 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.875034 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.875050 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.977405 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.977438 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.977448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.977461 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:14 crc kubenswrapper[4870]: I0130 08:10:14.977473 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:14Z","lastTransitionTime":"2026-01-30T08:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.061406 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 15:02:09.733681203 +0000 UTC Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.074125 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.074191 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:15 crc kubenswrapper[4870]: E0130 08:10:15.074238 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.074131 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.074134 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:15 crc kubenswrapper[4870]: E0130 08:10:15.074312 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:15 crc kubenswrapper[4870]: E0130 08:10:15.074503 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:15 crc kubenswrapper[4870]: E0130 08:10:15.074564 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.081079 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.081130 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.081147 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.081166 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.081182 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.184258 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.184319 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.184338 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.184363 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.184383 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.286781 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.286821 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.286831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.286848 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.286862 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.389948 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.390023 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.390045 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.390077 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.390095 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.493139 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.493183 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.493201 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.493222 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.493239 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.599184 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.599461 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.599549 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.599656 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.599756 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.703027 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.703096 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.703119 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.703147 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.703165 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.805582 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.805817 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.805964 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.806034 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.806097 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.908610 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.908687 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.908705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.908724 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:15 crc kubenswrapper[4870]: I0130 08:10:15.908738 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:15Z","lastTransitionTime":"2026-01-30T08:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.012029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.012093 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.012111 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.012137 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.012156 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.061990 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 04:06:39.153869656 +0000 UTC Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.116031 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.116097 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.116115 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.116139 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.116196 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.219597 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.219657 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.219671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.219698 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.219713 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.323345 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.323747 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.323935 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.324120 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.324278 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.426497 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.426542 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.426554 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.426574 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.426586 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.531395 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.531472 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.531510 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.531543 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.531566 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.634459 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.634603 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.634633 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.634656 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.634674 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.738187 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.738276 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.738309 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.738338 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.738360 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.841632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.841700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.841713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.841730 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.841742 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.946417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.946452 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.946462 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.946480 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:16 crc kubenswrapper[4870]: I0130 08:10:16.946491 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:16Z","lastTransitionTime":"2026-01-30T08:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.049367 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.049424 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.049443 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.049471 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.049488 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.062779 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 08:11:55.083827739 +0000 UTC Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.074332 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.074361 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.074430 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.074430 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:17 crc kubenswrapper[4870]: E0130 08:10:17.074478 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:17 crc kubenswrapper[4870]: E0130 08:10:17.074648 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:17 crc kubenswrapper[4870]: E0130 08:10:17.074678 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:17 crc kubenswrapper[4870]: E0130 08:10:17.074860 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.152689 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.152767 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.152785 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.152815 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.152834 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.255803 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.255856 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.255872 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.255919 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.255934 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.358490 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.358725 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.358740 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.358761 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.358775 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.391526 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:17 crc kubenswrapper[4870]: E0130 08:10:17.391795 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:17 crc kubenswrapper[4870]: E0130 08:10:17.391939 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:10:33.391867834 +0000 UTC m=+72.087414983 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.462208 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.462518 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.462811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.463254 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.463492 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.566075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.566195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.566223 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.566256 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.566278 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.669604 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.669649 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.669661 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.669680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.669691 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.772097 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.772125 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.772133 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.772145 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.772153 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.874491 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.874548 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.874561 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.874581 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.874595 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.961915 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.961951 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.961962 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.961978 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.961988 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:17 crc kubenswrapper[4870]: E0130 08:10:17.982132 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:17Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.992190 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.992273 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.992294 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.992321 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:17 crc kubenswrapper[4870]: I0130 08:10:17.992348 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:17Z","lastTransitionTime":"2026-01-30T08:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: E0130 08:10:18.013386 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.017565 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.017632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.017653 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.017682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.017708 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: E0130 08:10:18.038245 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.043098 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.043151 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.043171 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.043195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.043212 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: E0130 08:10:18.059202 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.063437 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 08:46:51.111240993 +0000 UTC Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.064660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.064726 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.064743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.064768 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.064786 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.075331 4870 scope.go:117] "RemoveContainer" containerID="bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483" Jan 30 08:10:18 crc kubenswrapper[4870]: E0130 08:10:18.087155 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa33
4c8e1b7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: E0130 08:10:18.087759 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.090256 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.090297 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.090314 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.090337 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.090355 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.193571 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.193631 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.193653 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.193682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.193704 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.297107 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.297608 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.297627 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.297652 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.297668 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.401544 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.401579 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.401591 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.401611 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.401625 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.476561 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/1.log" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.481680 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.484016 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.499902 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.504935 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc 
kubenswrapper[4870]: I0130 08:10:18.504979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.504998 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.505020 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.505036 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.517869 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.536330 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.554285 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.574811 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.598038 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.607730 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.607778 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.607790 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.607805 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.607816 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.618348 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.645226 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.658680 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.672147 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.682658 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.709899 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.709980 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.709993 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.710011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.710032 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.711791 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z 
is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.733758 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.753219 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.785224 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1cc
fe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac1516
5704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.812508 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.826929 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.836146 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:18Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:18 crc 
kubenswrapper[4870]: I0130 08:10:18.839835 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.839906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.839920 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.839940 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.839951 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.943162 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.943206 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.943218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.943234 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:18 crc kubenswrapper[4870]: I0130 08:10:18.943245 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:18Z","lastTransitionTime":"2026-01-30T08:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.046052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.046127 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.046148 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.046176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.046199 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.064386 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 19:19:48.76073837 +0000 UTC Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.074001 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.074051 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.074071 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.074015 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:19 crc kubenswrapper[4870]: E0130 08:10:19.074183 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:19 crc kubenswrapper[4870]: E0130 08:10:19.074246 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:19 crc kubenswrapper[4870]: E0130 08:10:19.074320 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:19 crc kubenswrapper[4870]: E0130 08:10:19.074409 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.149231 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.149269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.149279 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.149293 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.149302 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.251819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.251870 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.251914 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.251938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.251956 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.354869 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.355000 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.355020 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.355047 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.355065 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.457781 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.457834 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.457852 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.457919 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.457940 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.489190 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/2.log" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.490386 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/1.log" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.495359 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d" exitCode=1 Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.495429 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.495530 4870 scope.go:117] "RemoveContainer" containerID="bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.497035 4870 scope.go:117] "RemoveContainer" containerID="1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d" Jan 30 08:10:19 crc kubenswrapper[4870]: E0130 08:10:19.497393 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.518275 4870 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.535348 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.552327 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.561057 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.561106 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.561124 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.561148 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.561167 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.566814 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.580919 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f98
2e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.592505 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.602763 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.625856 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.639893 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.650522 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.663231 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.663259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.663269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.663281 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.663291 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.671123 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.698770 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfadb4b11c7243c5de202188c1e62ac35d26e29e9c6a1c53063369253c1e2483\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"message\\\":\\\"ancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI0130 08:09:59.417960 6308 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for 
network=default : 2.774229ms\\\\nI0130 08:09:59.418058 6308 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418156 6308 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 08:09:59.418188 6308 factory.go:656] Stopping watch factory\\\\nI0130 08:09:59.418151 6308 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}\\\\nI0130 08:09:59.418201 6308 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0130 08:09:59.418201 6308 services_controller.go:360] Finished syncing service machine-api-operator-machine-webhook on namespace openshift-machine-api for network=default : 3.106099ms\\\\nI0130 08:09:59.418213 6308 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 08:09:59.418248 6308 ovnkube.go:599] Stopped ovnkube\\\\nI0130 08:09:59.418286 6308 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0130 08:09:59.418396 6308 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin 
network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\
"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"
env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 
08:10:19.710346 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e485310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.728750 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.746000 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.760042 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.765271 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.765313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.765323 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.765340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.765355 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.780630 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.799188 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.867793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.867868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.867901 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.867919 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.867931 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.971292 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.971331 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.971340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.971354 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:19 crc kubenswrapper[4870]: I0130 08:10:19.971363 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:19Z","lastTransitionTime":"2026-01-30T08:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.064547 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 11:37:40.559730887 +0000 UTC Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.073574 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.073612 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.073621 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.073635 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.073643 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.176671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.176725 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.176744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.176768 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.176786 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.280410 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.280473 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.280490 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.280513 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.280531 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.383512 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.383556 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.383567 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.383583 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.383596 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.486673 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.486718 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.486729 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.486744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.486756 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.501193 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/2.log" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.505190 4870 scope.go:117] "RemoveContainer" containerID="1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d" Jan 30 08:10:20 crc kubenswrapper[4870]: E0130 08:10:20.505382 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.519902 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.537627 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.553316 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.570130 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.588744 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b86401
85351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30
T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.590868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.590955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.590972 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.590997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.591016 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.605017 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.638515 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233
caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.658703 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.675042 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.693783 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.693841 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.693857 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 
08:10:20.693906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.693925 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.701658 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a
98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.729658 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.745212 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.762868 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc 
kubenswrapper[4870]: I0130 08:10:20.783190 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.796727 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.796777 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.796792 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.796811 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.796824 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.798477 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.816699 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.832105 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.846390 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:20Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.900010 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.900063 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.900075 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.900092 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:20 crc kubenswrapper[4870]: I0130 08:10:20.900106 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:20Z","lastTransitionTime":"2026-01-30T08:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.003268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.003303 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.003312 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.003328 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.003340 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.064741 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 07:02:34.005110745 +0000 UTC Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.074394 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.074510 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:21 crc kubenswrapper[4870]: E0130 08:10:21.074683 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.074712 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.074730 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:21 crc kubenswrapper[4870]: E0130 08:10:21.074828 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:21 crc kubenswrapper[4870]: E0130 08:10:21.075017 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:21 crc kubenswrapper[4870]: E0130 08:10:21.075111 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.106630 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.106696 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.106720 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.106749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.106772 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.209600 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.209671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.209695 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.209728 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.209754 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.313166 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.313232 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.313256 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.313287 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.313311 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.416280 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.416328 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.416338 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.416353 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.416364 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.519029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.519139 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.519160 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.519182 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.519241 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.622203 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.622265 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.622283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.622312 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.622334 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.725225 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.725268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.725277 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.725292 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.725301 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.827963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.828004 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.828015 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.828029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.828041 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.931191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.931268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.931290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.931320 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:21 crc kubenswrapper[4870]: I0130 08:10:21.931342 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:21Z","lastTransitionTime":"2026-01-30T08:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.035207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.035267 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.035282 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.035304 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.035323 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.065444 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 05:14:16.569602601 +0000 UTC Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.098262 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"na
me\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.130205 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.138076 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.138137 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.138156 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.138181 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.138198 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.145777 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e485310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.163147 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc 
kubenswrapper[4870]: I0130 08:10:22.177110 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.195184 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.210669 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.231019 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.245174 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.245235 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.245255 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.245293 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.245314 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.251077 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.270831 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.289973 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.310533 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.324507 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.346436 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.348022 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.348083 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.348101 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.348149 4870 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.348168 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.361848 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee
88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources
\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.384701 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.399098 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.411999 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:22Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.452374 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.452448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.452467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.452492 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.452510 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.555461 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.555554 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.555573 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.555955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.555976 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.659185 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.659247 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.659316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.659347 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.659368 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.762538 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.762600 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.762617 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.762639 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.762658 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.866392 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.866455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.866478 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.866510 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.866532 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.969923 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.969984 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.970007 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.970035 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:22 crc kubenswrapper[4870]: I0130 08:10:22.970057 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:22Z","lastTransitionTime":"2026-01-30T08:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.066273 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:16:42.085473252 +0000 UTC
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.072705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.072764 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.072780 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.072805 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.072823 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.073625 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.073693 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.073709 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.073631 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:23 crc kubenswrapper[4870]: E0130 08:10:23.073787 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:23 crc kubenswrapper[4870]: E0130 08:10:23.073976 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:23 crc kubenswrapper[4870]: E0130 08:10:23.074117 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:23 crc kubenswrapper[4870]: E0130 08:10:23.074214 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.175335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.175404 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.175421 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.175491 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.175518 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.279751 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.279829 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.279863 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.279949 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.279974 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.383512 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.383574 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.383590 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.383617 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.383633 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.487209 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.487282 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.487306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.487339 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.487361 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.590696 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.590737 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.590749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.590766 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.590778 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.693526 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.693570 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.693580 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.693596 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.693606 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.797523 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.797556 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.797564 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.797579 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.797589 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.900657 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.900720 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.900732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.900751 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:23 crc kubenswrapper[4870]: I0130 08:10:23.900763 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:23Z","lastTransitionTime":"2026-01-30T08:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.003632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.003675 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.003686 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.003704 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.003715 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.067186 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 21:06:27.472271315 +0000 UTC
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.106176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.106236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.106246 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.106261 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.106272 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.209401 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.209456 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.209467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.209486 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.209497 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.311655 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.311706 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.311716 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.311732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.311741 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.414940 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.414990 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.415005 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.415026 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.415039 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.517787 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.517851 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.517871 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.517924 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.517942 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.621109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.621191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.621210 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.621236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.621255 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.724812 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.724942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.724963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.724990 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.725007 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.828525 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.828579 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.828597 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.828618 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.828634 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.932445 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.932520 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.932534 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.932555 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:24 crc kubenswrapper[4870]: I0130 08:10:24.932569 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:24Z","lastTransitionTime":"2026-01-30T08:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.035302 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.035336 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.035347 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.035362 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.035372 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.068058 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 15:12:08.066775524 +0000 UTC
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.073606 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:25 crc kubenswrapper[4870]: E0130 08:10:25.073764 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.074067 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:25 crc kubenswrapper[4870]: E0130 08:10:25.074178 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.074389 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:25 crc kubenswrapper[4870]: E0130 08:10:25.074511 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.075025 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:25 crc kubenswrapper[4870]: E0130 08:10:25.075132 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.139115 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.139174 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.139215 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.139249 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.139271 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.242826 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.242904 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.242922 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.242948 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.242970 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.346427 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.346489 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.346504 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.346535 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.346553 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.449911 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.449973 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.449994 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.450019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.450037 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.554150 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.554206 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.554217 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.554240 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.554258 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.657019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.657073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.657084 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.657103 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.657119 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.760518 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.760581 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.760592 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.760615 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.760630 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.864062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.864187 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.864206 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.864230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.864250 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.966829 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.966962 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.966999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.967030 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:25 crc kubenswrapper[4870]: I0130 08:10:25.967047 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:25Z","lastTransitionTime":"2026-01-30T08:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.068195 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 06:12:24.777881352 +0000 UTC Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.071126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.071174 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.071189 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.071211 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.071227 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.174245 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.174290 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.174303 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.174324 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.174336 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.277431 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.277478 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.277490 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.277507 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.277518 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.380609 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.380666 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.380678 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.380698 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.380711 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.483531 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.483577 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.483591 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.483608 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.483622 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.586916 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.586971 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.586982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.587002 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.587015 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.689641 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.689691 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.689703 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.689721 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.689734 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.792371 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.792550 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.792645 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.792679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.792706 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.895021 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.895059 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.895070 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.895086 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.895098 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.997433 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.997483 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.997500 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.997520 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:26 crc kubenswrapper[4870]: I0130 08:10:26.997535 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:26Z","lastTransitionTime":"2026-01-30T08:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.069040 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 23:36:16.227977342 +0000 UTC Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.074421 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.074460 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.074495 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.074601 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:27 crc kubenswrapper[4870]: E0130 08:10:27.074753 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:27 crc kubenswrapper[4870]: E0130 08:10:27.074903 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:27 crc kubenswrapper[4870]: E0130 08:10:27.075014 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:27 crc kubenswrapper[4870]: E0130 08:10:27.075141 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.100448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.100513 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.100525 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.100545 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.100560 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.202571 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.202606 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.202618 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.202632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.202645 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.305067 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.305094 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.305105 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.305120 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.305131 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.407281 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.407313 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.407324 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.407340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.407351 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.509635 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.509680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.509690 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.509710 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.509725 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.612483 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.612545 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.612565 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.612593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.612615 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.714968 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.715006 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.715016 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.715031 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.715042 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.817034 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.817060 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.817198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.817224 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.817237 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.919749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.919790 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.919800 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.919817 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:27 crc kubenswrapper[4870]: I0130 08:10:27.919827 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:27Z","lastTransitionTime":"2026-01-30T08:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.022399 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.022443 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.022451 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.022467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.022477 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.069385 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 10:54:17.093810516 +0000 UTC Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.125912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.125955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.125966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.125984 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.125996 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.227713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.228008 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.228097 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.228173 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.228251 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.331385 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.331449 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.331465 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.331488 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.331506 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.434473 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.434560 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.434573 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.434589 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.434616 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.476960 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.477017 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.477037 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.477065 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.477081 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: E0130 08:10:28.492037 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:28Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.496699 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.496739 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.496769 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.496792 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.496806 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: E0130 08:10:28.511191 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:28Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.515418 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.515471 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.515486 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.515508 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.515523 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: E0130 08:10:28.527302 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:28Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.531744 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.531779 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.531792 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.531814 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.531829 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: E0130 08:10:28.551404 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:28Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.555181 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.555234 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.555247 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.555269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.555284 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: E0130 08:10:28.567449 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:28Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:28 crc kubenswrapper[4870]: E0130 08:10:28.567684 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.569637 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.569698 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.569711 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.569733 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.569747 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.672991 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.673048 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.673058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.673078 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.673090 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.776014 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.776062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.776072 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.776093 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.776114 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.879984 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.880023 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.880032 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.880049 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.880059 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.982831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.982928 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.982945 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.982974 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:28 crc kubenswrapper[4870]: I0130 08:10:28.982993 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:28Z","lastTransitionTime":"2026-01-30T08:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.070564 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 21:02:15.724052842 +0000 UTC Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.073987 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.074048 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.074048 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:29 crc kubenswrapper[4870]: E0130 08:10:29.074136 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:29 crc kubenswrapper[4870]: E0130 08:10:29.074272 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.074297 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:29 crc kubenswrapper[4870]: E0130 08:10:29.074366 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:29 crc kubenswrapper[4870]: E0130 08:10:29.074418 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.085368 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.085399 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.085411 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.085427 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.085439 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.187586 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.187650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.187659 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.187684 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.187698 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.290850 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.290902 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.290911 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.290929 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.290942 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.392999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.393037 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.393048 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.393066 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.393079 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.495762 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.495819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.495830 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.495849 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.495861 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.598544 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.598584 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.598593 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.598611 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.598621 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.701174 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.701221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.701230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.701245 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.701257 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.804083 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.804179 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.804203 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.804228 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.804246 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.906844 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.906908 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.906922 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.906944 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:29 crc kubenswrapper[4870]: I0130 08:10:29.906959 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:29Z","lastTransitionTime":"2026-01-30T08:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.010398 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.010463 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.010476 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.010500 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.010513 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.071024 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 03:52:27.537411847 +0000 UTC Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.113365 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.113397 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.113408 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.113423 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.113434 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.216039 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.216090 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.216106 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.216132 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.216144 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.318740 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.318767 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.318775 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.318787 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.318796 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.421150 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.421195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.421204 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.421218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.421226 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.524376 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.524430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.524444 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.524467 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.524481 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.626551 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.626601 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.626614 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.626632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.626660 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.729493 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.729539 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.729550 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.729567 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.729579 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.832265 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.832356 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.832372 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.832395 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.832410 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.935030 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.935074 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.935085 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.935101 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:30 crc kubenswrapper[4870]: I0130 08:10:30.935112 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:30Z","lastTransitionTime":"2026-01-30T08:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.037636 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.037673 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.037683 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.037698 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.037712 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.071773 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:16:48.587793481 +0000 UTC Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.074033 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.074051 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.074065 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:31 crc kubenswrapper[4870]: E0130 08:10:31.074176 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.074200 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:31 crc kubenswrapper[4870]: E0130 08:10:31.074330 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:31 crc kubenswrapper[4870]: E0130 08:10:31.074383 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:31 crc kubenswrapper[4870]: E0130 08:10:31.074438 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.140325 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.140387 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.140398 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.140419 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.140433 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.243482 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.243521 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.243529 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.243544 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.243558 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.348295 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.348355 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.348406 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.348432 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.348446 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.451831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.451906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.451917 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.451970 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.451988 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.554852 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.554927 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.554938 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.554961 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.554977 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.657645 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.657711 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.657725 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.657749 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.657765 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.760914 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.760966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.760975 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.760992 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.761003 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.863774 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.863819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.863838 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.863861 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.863892 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.966688 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.966753 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.966763 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.966784 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:31 crc kubenswrapper[4870]: I0130 08:10:31.966796 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:31Z","lastTransitionTime":"2026-01-30T08:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.069346 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.069404 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.069417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.069437 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.069447 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.072737 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 08:27:59.136390953 +0000 UTC Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.074812 4870 scope.go:117] "RemoveContainer" containerID="1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d" Jan 30 08:10:32 crc kubenswrapper[4870]: E0130 08:10:32.075009 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.096159 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.115822 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.141816 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin 
network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.156802 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.170846 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.170900 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.170915 4870 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.170934 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.170948 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.173317 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.188196 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.201997 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.217106 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.231063 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.242465 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.255240 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.270968 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.273158 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.273217 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.273230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.273251 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.273271 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.285720 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.301303 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f98
2e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.319934 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.334758 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.354210 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.368742 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:32Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.375316 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.375356 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.375367 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.375383 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.375394 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.478501 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.478559 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.478573 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.478594 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.478626 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.581661 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.581713 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.581723 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.581743 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.581756 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.684975 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.685024 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.685040 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.685064 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.685082 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.787414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.787458 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.787470 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.787486 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.787498 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.889454 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.889486 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.889495 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.889510 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.889520 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.991656 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.991695 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.991704 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.991718 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:32 crc kubenswrapper[4870]: I0130 08:10:32.991728 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:32Z","lastTransitionTime":"2026-01-30T08:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.073712 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:50:40.573676243 +0000 UTC Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.073954 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.073950 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.074055 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:33 crc kubenswrapper[4870]: E0130 08:10:33.074175 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.074394 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:33 crc kubenswrapper[4870]: E0130 08:10:33.074466 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:33 crc kubenswrapper[4870]: E0130 08:10:33.074630 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:33 crc kubenswrapper[4870]: E0130 08:10:33.074781 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.094251 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.094282 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.094291 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.094303 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.094312 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.197007 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.197072 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.197087 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.197109 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.197127 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.299306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.299339 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.299348 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.299361 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.299370 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.402537 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.402750 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.402843 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.402942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.403016 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.471284 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:33 crc kubenswrapper[4870]: E0130 08:10:33.471511 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:33 crc kubenswrapper[4870]: E0130 08:10:33.471576 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:11:05.471553758 +0000 UTC m=+104.167100877 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.506894 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.506954 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.506964 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.506989 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.507001 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.610050 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.610330 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.610402 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.610466 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.610522 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.713045 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.713080 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.713089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.713104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.713115 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.816389 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.816430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.816443 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.816459 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.816471 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.918676 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.918948 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.919029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.919146 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:33 crc kubenswrapper[4870]: I0130 08:10:33.919239 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:33Z","lastTransitionTime":"2026-01-30T08:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.021185 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.021227 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.021238 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.021258 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.021268 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.074481 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 14:33:45.734271364 +0000 UTC Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.124204 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.124275 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.124293 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.124318 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.124335 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.226950 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.227019 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.227036 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.227062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.227079 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.329430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.329792 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.329921 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.330041 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.330130 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.433236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.433289 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.433302 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.433323 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.433335 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.536067 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.536377 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.536451 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.536512 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.536593 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.639911 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.639976 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.639989 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.640014 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.640030 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.742273 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.742484 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.742571 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.742637 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.742706 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.845937 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.845989 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.845999 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.846020 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.846057 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.948378 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.948426 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.948444 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.948464 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:34 crc kubenswrapper[4870]: I0130 08:10:34.948479 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:34Z","lastTransitionTime":"2026-01-30T08:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.052195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.052258 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.052270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.052294 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.052309 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.074679 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.074746 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.074701 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 17:10:57.674195595 +0000 UTC Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.074895 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:35 crc kubenswrapper[4870]: E0130 08:10:35.074997 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.075142 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:35 crc kubenswrapper[4870]: E0130 08:10:35.075270 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:35 crc kubenswrapper[4870]: E0130 08:10:35.075412 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:35 crc kubenswrapper[4870]: E0130 08:10:35.075708 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.156255 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.156311 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.156327 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.156354 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.156372 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.260012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.260081 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.260100 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.260132 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.260152 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.362727 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.362798 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.362821 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.362849 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.362912 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.466389 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.466495 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.466511 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.466535 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.466550 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.552019 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/0.log" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.552095 4870 generic.go:334] "Generic (PLEG): container finished" podID="3e8e9e25-2b9b-4820-8282-48e1d930a721" containerID="f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a" exitCode=1 Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.552149 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerDied","Data":"f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.552822 4870 scope.go:117] "RemoveContainer" containerID="f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.569923 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.569955 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.569965 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.569982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.569996 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.583335 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4
eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.603065 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.619033 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.634632 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.651629 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"2026-01-30T08:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431\\\\n2026-01-30T08:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431 to /host/opt/cni/bin/\\\\n2026-01-30T08:09:50Z [verbose] multus-daemon started\\\\n2026-01-30T08:09:50Z [verbose] Readiness Indicator file check\\\\n2026-01-30T08:10:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.664451 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.676538 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.676585 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.676594 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.676611 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.676622 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.677808 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.702518 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.717743 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.729351 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc 
kubenswrapper[4870]: I0130 08:10:35.746324 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1b
c907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.761774 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.776189 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.779221 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.779252 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.779262 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.779279 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.779290 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.792353 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.807828 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.824316 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.838643 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.857973 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:35Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.881669 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.881710 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.881725 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.881742 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.881755 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.983598 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.983628 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.983635 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.983648 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:35 crc kubenswrapper[4870]: I0130 08:10:35.983657 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:35Z","lastTransitionTime":"2026-01-30T08:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.076124 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:26:41.724408914 +0000 UTC Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.085378 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.085402 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.085411 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.085424 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.085435 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.188369 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.188424 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.188433 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.188453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.188464 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.290963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.290995 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.291005 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.291021 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.291031 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.393648 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.393706 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.393716 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.393737 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.393752 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.496253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.496324 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.496344 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.496366 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.496384 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.558589 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/0.log" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.558686 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerStarted","Data":"e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.577275 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.589612 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.600912 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.600950 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.600959 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.600974 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.600984 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.601231 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.611712 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.631742 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.645150 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.655643 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.663423 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.673502 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"2026-01-30T08:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431\\\\n2026-01-30T08:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431 to /host/opt/cni/bin/\\\\n2026-01-30T08:09:50Z [verbose] multus-daemon started\\\\n2026-01-30T08:09:50Z [verbose] Readiness Indicator file check\\\\n2026-01-30T08:10:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.689053 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f8
58676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.704127 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.704171 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.704181 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.704197 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.704209 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.722016 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.737368 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.751005 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc 
kubenswrapper[4870]: I0130 08:10:36.765138 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.780952 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.796187 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.807336 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.807385 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.807395 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.807414 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.807429 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.812747 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.828524 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:36Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.910861 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.910928 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.910942 4870 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.910962 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:36 crc kubenswrapper[4870]: I0130 08:10:36.910975 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:36Z","lastTransitionTime":"2026-01-30T08:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.013184 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.013246 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.013261 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.013283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.013296 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.073832 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.073907 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.073931 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.074009 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:37 crc kubenswrapper[4870]: E0130 08:10:37.074074 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:37 crc kubenswrapper[4870]: E0130 08:10:37.074181 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:37 crc kubenswrapper[4870]: E0130 08:10:37.074324 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:37 crc kubenswrapper[4870]: E0130 08:10:37.074452 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.077083 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 13:45:14.885537041 +0000 UTC
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.115606 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.115658 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.115674 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.115694 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.115711 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.218195 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.218237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.218245 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.218259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.218269 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.320253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.320295 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.320306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.320321 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.320332 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.423544 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.423584 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.423596 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.423616 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.423630 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.526089 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.526144 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.526155 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.526176 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.526192 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.628595 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.628635 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.628644 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.628671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.628682 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.732207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.732259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.732270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.732291 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.732305 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.835408 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.835459 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.835476 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.835502 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.835519 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.938702 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.938737 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.938748 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.938762 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:37 crc kubenswrapper[4870]: I0130 08:10:37.938775 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:37Z","lastTransitionTime":"2026-01-30T08:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.040826 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.040864 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.040899 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.040916 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.040928 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.077397 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 13:37:50.40981678 +0000 UTC
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.143158 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.143209 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.143226 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.143247 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.143264 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.245638 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.245670 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.245679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.245695 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.245706 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.347547 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.347659 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.347678 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.347702 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.347720 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.450419 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.450471 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.450490 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.450515 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.450538 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.554483 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.554523 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.554531 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.554545 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.554553 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.657863 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.657911 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.657921 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.657936 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.657946 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.759936 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.759965 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.759975 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.759988 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.759998 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.823738 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.823806 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.823829 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.823925 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.823954 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: E0130 08:10:38.839352 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:38Z is after 2025-08-24T17:21:41Z"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.844048 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.844104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.844127 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.844155 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.844177 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:38 crc kubenswrapper[4870]: E0130 08:10:38.864308 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:38Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.869303 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.869352 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.869369 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.869393 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.869410 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:38 crc kubenswrapper[4870]: E0130 08:10:38.883626 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:38Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.888216 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.888259 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.888270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.888288 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.888300 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:38 crc kubenswrapper[4870]: E0130 08:10:38.905870 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:38Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.911037 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.911097 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.911121 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.911147 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.911167 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:38 crc kubenswrapper[4870]: E0130 08:10:38.925382 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42bb4058-de5f-47d3-b90e-bda57dd064e9\\\",\\\"systemUUID\\\":\\\"7dbac932-0e54-4045-a1f0-fa334c8e1b7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:38Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:38 crc kubenswrapper[4870]: E0130 08:10:38.925529 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.927158 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.927189 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.927198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.927213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:38 crc kubenswrapper[4870]: I0130 08:10:38.927226 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:38Z","lastTransitionTime":"2026-01-30T08:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.030448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.030522 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.030544 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.030573 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.030599 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.074551 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.074610 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.074549 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.074696 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:39 crc kubenswrapper[4870]: E0130 08:10:39.074720 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:39 crc kubenswrapper[4870]: E0130 08:10:39.074856 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:39 crc kubenswrapper[4870]: E0130 08:10:39.075077 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:39 crc kubenswrapper[4870]: E0130 08:10:39.075209 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.077834 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:27:35.347328825 +0000 UTC Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.133610 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.133660 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.133679 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.133705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.133724 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.236275 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.236338 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.236361 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.236392 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.236418 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.338559 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.338603 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.338613 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.338627 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.338638 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.440920 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.440944 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.440953 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.440966 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.440974 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.544519 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.544595 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.544618 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.544650 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.544675 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.646857 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.646934 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.646952 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.646976 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.646990 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.749051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.749102 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.749122 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.749144 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.749161 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.851304 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.851340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.851348 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.851362 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.851372 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.954152 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.954207 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.954225 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.954247 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:39 crc kubenswrapper[4870]: I0130 08:10:39.954264 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:39Z","lastTransitionTime":"2026-01-30T08:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.057595 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.057667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.057705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.057731 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.057750 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.078812 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 06:45:51.814333235 +0000 UTC Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.160909 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.160969 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.160989 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.161011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.161028 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.264305 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.264419 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.264436 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.264452 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.264465 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.367362 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.367424 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.367441 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.367462 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.367477 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.470984 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.471044 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.471062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.471085 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.471101 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.573462 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.573532 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.573545 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.573562 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.573594 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.677582 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.677623 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.677632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.677646 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.677656 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.780752 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.780793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.780805 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.780823 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.780835 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.884257 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.884299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.884307 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.884322 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.884332 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.987360 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.987430 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.987444 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.987466 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:40 crc kubenswrapper[4870]: I0130 08:10:40.987483 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:40Z","lastTransitionTime":"2026-01-30T08:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.073985 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.074024 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.074001 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.074128 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:41 crc kubenswrapper[4870]: E0130 08:10:41.074154 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:41 crc kubenswrapper[4870]: E0130 08:10:41.074261 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:41 crc kubenswrapper[4870]: E0130 08:10:41.074426 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:41 crc kubenswrapper[4870]: E0130 08:10:41.074564 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.079935 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 13:26:21.890386248 +0000 UTC Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.090525 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.090560 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.090571 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.090588 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.090601 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.192611 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.192644 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.192653 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.192667 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.192676 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.295403 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.295450 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.295461 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.295481 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.295493 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.398128 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.398213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.398239 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.398270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.398292 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.501644 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.501705 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.501719 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.501741 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.501762 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.604923 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.604979 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.604993 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.605011 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.605025 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.708092 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.708143 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.708171 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.708191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.708203 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.810175 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.810216 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.810225 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.810238 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.810247 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.913126 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.913191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.913202 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.913235 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:41 crc kubenswrapper[4870]: I0130 08:10:41.913248 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:41Z","lastTransitionTime":"2026-01-30T08:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.015365 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.015407 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.015419 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.015436 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.015450 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.080307 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:30:13.061444685 +0000 UTC Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.093845 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.112632 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.117575 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.117626 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.117640 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.117658 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.117669 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.133284 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.153404 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.180456 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.199178 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.214816 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.225040 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.225360 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.225554 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.225782 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.226014 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.232149 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.248303 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.262070 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.276545 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"2026-01-30T08:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431\\\\n2026-01-30T08:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431 to /host/opt/cni/bin/\\\\n2026-01-30T08:09:50Z [verbose] multus-daemon started\\\\n2026-01-30T08:09:50Z [verbose] Readiness Indicator file check\\\\n2026-01-30T08:10:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.294167 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.307969 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.328592 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.328636 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.328647 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.328664 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.328676 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.332573 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.346708 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.362681 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc 
kubenswrapper[4870]: I0130 08:10:42.382677 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1b
c907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.409322 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:42Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.432434 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.432490 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.432502 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.432521 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.432535 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.535237 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.535295 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.535307 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.535328 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.535340 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.637831 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.637942 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.637952 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.637970 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.637980 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.740448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.740493 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.740502 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.740517 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.740528 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.843659 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.843706 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.843716 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.843732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.843743 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.947920 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.947973 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.947990 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.948012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:42 crc kubenswrapper[4870]: I0130 08:10:42.948029 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:42Z","lastTransitionTime":"2026-01-30T08:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.050732 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.050792 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.050983 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.051055 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.051085 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.073800 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.073918 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.074015 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:43 crc kubenswrapper[4870]: E0130 08:10:43.074245 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.074350 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:43 crc kubenswrapper[4870]: E0130 08:10:43.074474 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:43 crc kubenswrapper[4870]: E0130 08:10:43.075377 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:43 crc kubenswrapper[4870]: E0130 08:10:43.075506 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.075981 4870 scope.go:117] "RemoveContainer" containerID="1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.081427 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 03:41:50.574799766 +0000 UTC Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.155324 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.155801 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.155813 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.155832 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.155845 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.258682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.258735 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.258747 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.258772 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.258785 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.361104 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.361148 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.361162 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.361180 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.361192 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.464043 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.464091 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.464112 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.464135 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.464150 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.568241 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.568310 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.568345 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.568386 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.568407 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.594951 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/2.log" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.615419 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.616296 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.639256 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457
887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.657856 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.672218 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.672268 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.672282 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.672302 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.672315 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.681594 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519
fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.698228 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.709939 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.724532 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"2026-01-30T08:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431\\\\n2026-01-30T08:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431 to /host/opt/cni/bin/\\\\n2026-01-30T08:09:50Z [verbose] multus-daemon started\\\\n2026-01-30T08:09:50Z [verbose] Readiness Indicator file check\\\\n2026-01-30T08:10:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.737643 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.764793 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f8
58676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.782635 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: 
*v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.793748 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.830327 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc 
kubenswrapper[4870]: I0130 08:10:43.836670 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.836711 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.836722 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.836740 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.836752 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.841983 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f
68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.861077 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.871895 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.883677 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.895725 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.906123 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.915980 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:43Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.938923 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.938963 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.938980 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.939002 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:43 crc kubenswrapper[4870]: I0130 08:10:43.939019 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:43Z","lastTransitionTime":"2026-01-30T08:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.041389 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.041444 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.041453 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.041472 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.041482 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.082496 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 14:37:22.24357326 +0000 UTC Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.144868 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.144941 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.144953 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.144974 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.144985 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.248762 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.248803 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.248818 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.248838 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.248852 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.352159 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.352264 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.352278 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.352307 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.352323 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.455059 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.455118 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.455132 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.455152 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.455164 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.558511 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.558594 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.558627 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.558658 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.558683 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.623307 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/3.log" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.624077 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/2.log" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.628327 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258" exitCode=1 Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.628394 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.628445 4870 scope.go:117] "RemoveContainer" containerID="1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.629083 4870 scope.go:117] "RemoveContainer" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258" Jan 30 08:10:44 crc kubenswrapper[4870]: E0130 08:10:44.629286 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.647251 4870 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.662095 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.662191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.662213 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.662282 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.662304 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.665610 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver
-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 
08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.684432 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.705738 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.724304 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.740908 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.761139 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.766555 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.766591 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.766608 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.766628 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.766642 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.775768 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.792329 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.823465 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08
:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.848911 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.868056 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.870417 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.870448 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.870460 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 
08:10:44.870476 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.870488 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.883315 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.896434 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"2026-01-30T08:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431\\\\n2026-01-30T08:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431 to /host/opt/cni/bin/\\\\n2026-01-30T08:09:50Z [verbose] multus-daemon started\\\\n2026-01-30T08:09:50Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T08:10:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.898798 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:10:44 crc kubenswrapper[4870]: E0130 08:10:44.898947 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:48.898925314 +0000 UTC m=+147.594472433 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.913232 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5
ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-3
0T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.935569 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1da023d97f1e045ab17e1a0d5c12dea48d7a49ef948846242561a014653e7a1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:19Z\\\",\\\"message\\\":\\\"ne-config-daemon-j4sd8\\\\nI0130 08:10:19.196562 6529 ovn.go:134] Ensuring zone local for 
Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 in node crc\\\\nF0130 08:10:19.196560 6529 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:19Z is after 2025-08-24T17:21:41Z]\\\\nI0130 08:10:19.196573 6529 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8 after 0 failed attempt(s)\\\\nI0130 08:10:19.196569 6529 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-8kvt7\\\\nI0130 08:10:19.196583 6529 obj_retry.go:303] Retry object setup: *v1.Po\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:44Z\\\",\\\"message\\\":\\\"ontroller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-cj5db\\\\nI0130 08:10:44.228252 6956 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0130 08:10:44.228262 6956 
default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0130 08:10:44.228249 6956 ovn.go:134] Ensuring zone local for Pod openshift-kube-scheduler/openshift-kube-scheduler-crc in node crc\\\\nI0130 08:10:44.228275 6956 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc after 0 failed attempt(s)\\\\nI0130 08:10:44.228279 6956 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq\\\\nI0130 08:10:44.228289 6956 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8\\\\nI0130 08:10:44.228296 6956 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8\\\\nI0130 08:10:44.228299 6956 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq\\\\nI0130 08:10:44.228311 6956 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-mp9vw\\\\nI0130 
08:10:44.2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497
572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.952191 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.966490 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:44Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:44 crc 
kubenswrapper[4870]: I0130 08:10:44.973269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.973299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.973309 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.973325 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:44 crc kubenswrapper[4870]: I0130 08:10:44.973337 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:44Z","lastTransitionTime":"2026-01-30T08:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.000466 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.000507 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.000525 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.000559 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.000654 4870 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.000692 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:11:49.000679655 +0000 UTC m=+147.696226774 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.000824 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.000835 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.000844 4870 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.000864 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-30 08:11:49.000858571 +0000 UTC m=+147.696405680 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.001008 4870 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.001035 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 08:11:49.001027946 +0000 UTC m=+147.696575055 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.001076 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.001107 4870 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.001123 4870 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.001180 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 08:11:49.00116421 +0000 UTC m=+147.696711329 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.074223 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.074249 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.074226 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.074409 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.074215 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.074747 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.074940 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.075142 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.078006 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.078055 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.078073 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.078099 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.078117 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.083154 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 01:14:45.785929181 +0000 UTC Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.181584 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.181657 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.181680 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.181709 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.181731 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.285317 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.285389 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.285413 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.285442 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.285466 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.389151 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.389217 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.389230 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.389251 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.389264 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.491893 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.491958 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.491973 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.491998 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.492015 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.594317 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.594356 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.594364 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.594379 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.594389 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.645218 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/3.log" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.650055 4870 scope.go:117] "RemoveContainer" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258" Jan 30 08:10:45 crc kubenswrapper[4870]: E0130 08:10:45.650219 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.663302 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hsmrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8e9e25-2b9b-4820-8282-48e1d930a721\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:35Z\\\",\\\"message\\\":\\\"2026-01-30T08:09:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431\\\\n2026-01-30T08:09:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_63624c98-b351-4f15-8ccd-8ea961785431 to /host/opt/cni/bin/\\\\n2026-01-30T08:09:50Z [verbose] multus-daemon started\\\\n2026-01-30T08:09:50Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T08:10:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7mcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hsmrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.674046 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a5c19ea869d0983bc1b4bad592a3d73b8640185351f260c21d3fb0ba97eac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d4
0b2d681901860cf3608c5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hp4df\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j4sd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.684958 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dpj7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228f8bf9-7e75-4886-8441-57bc0d251413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872d591619cf7b2e3ef588c41d9d457887802dda965a2b87ef21b1ff023b29f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9dtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dpj7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.697299 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.697335 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.697346 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.697363 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.697374 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.706186 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26285d6c-2642-414d-9dc0-904549b87975\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd742081271df9e1741e1598e3017ae81c62a78f41203638f049a870e15f43cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3556e852240298ce97a0728d74120b9feec9bdab0d5450948ec31a7a64751973\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e03d4387439ba106308e50df06e5b0b9725e6f93fce8ea2e8159514951132572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd268ae9709b31196dd855a430d621ba1f27e1003438f09ff6c8f9d1c3164f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026760c59859da40f402d09c4912636bc87455cd90302b4bea238eeba0ff451e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22b1cbc698afc544b846c470ad20c60d804f11b6d03594bb539fcee97d547c03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://324a44e3853b8b314a5eef793617fcf46eae6ed12900e1006e78660a1aa04ca9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1233caf10c907a6e2694d76c367bbc12eb25378636b47aa4eced46222db68fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T08:09:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.720327 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7745d9bc-1d2f-4854-a0d9-0b487817c8d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d500cb8fa8550901863f5e814c6c992509f42e77808722ec63c9aa9de81fd673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f8c543f5b6d1542e63c6ec4417ada801afdeb2f979adb1bfa5aef7a4efa2a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6fc71b720ba0da0b93c2a74bb3f9ba338bd2206037cd1aa1e51d6cb5dd4065d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.753189 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.774619 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8kvt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1239efc2-d4e8-4a88-a0bf-00a685812999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://82c9470a761f33a3cb266f70865d69fae18e7a5946ca3d15365858d914d641a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzdvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8kvt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.792997 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd7f5e-1187-4760-b2dc-98c3d3286f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9d4f6f4f3d0bcb6476b76a9154e3cdf46867cb903cc02180043802658b6e14a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ebec6d66bdcb97ce260a1d7c16eb1b7db9b39b9a73b5f51476798bfba7c0fc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0747fc4e1ccfe703e05a090c3f1bc907110788691c6b772ce07e078091bc77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b4b67ac15165704d2807acc0575ae3504c3cf5073195e6e4323edd116925e63\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://199f858676f482865c8bf20140fdae1afbda8fbccec75486883fa04da772369e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b16
5562ed487f13c5ecf352597a1a35f54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05ff9365b75b0e4851937cc8f4709b165562ed487f13c5ecf352597a1a35f54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bd4ef1a577999ab8134e6571f97a98adb00d3e27625748b7cff7df460683ef1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-01-30T08:09:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwsvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rrkfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.799074 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.799098 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.799107 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.799119 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.799129 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.812716 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36037609-52f9-4c09-8beb-6d35a039347b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T08:10:44Z\\\",\\\"message\\\":\\\"ontroller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-cj5db\\\\nI0130 08:10:44.228252 6956 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0130 08:10:44.228262 6956 default_network_controller.go:776] Recording success event on pod 
openshift-kube-apiserver/kube-apiserver-crc\\\\nI0130 08:10:44.228249 6956 ovn.go:134] Ensuring zone local for Pod openshift-kube-scheduler/openshift-kube-scheduler-crc in node crc\\\\nI0130 08:10:44.228275 6956 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc after 0 failed attempt(s)\\\\nI0130 08:10:44.228279 6956 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq\\\\nI0130 08:10:44.228289 6956 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8\\\\nI0130 08:10:44.228296 6956 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/machine-config-daemon-j4sd8\\\\nI0130 08:10:44.228299 6956 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq\\\\nI0130 08:10:44.228311 6956 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/network-metrics-daemon-mp9vw\\\\nI0130 08:10:44.2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:10:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daefd50679d10beed7
fe7a9014c1cb34525b08c2c5b9cf04311684497572745f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pk5ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cj5db\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.823654 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b66c5f2c-1e0e-4d09-ab12-8cd255f29aa8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe7ab8f2efaea80fb8f85a7d33f79d70eb848a65d1fc6dd1cacacc479c1065c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a87e50cd50d8225874491d677cc6961215e4
85310948984353dda93ab97eced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:10:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fglbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5xdq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.833245 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b976744-b72d-4291-a32f-437fc1cfbf03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx29n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:10:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mp9vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc 
kubenswrapper[4870]: I0130 08:10:45.845323 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8759add-f10c-406f-a1ee-1a4530b369b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:10:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb553cf7aed8c067e432248cd08fdd7db46f939c608b18c4e25737b8d115d52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11111845069f3d8452723ee5954d1f68e314872eda1d204abf733f4871a7ec40\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb3033803f0aa71bcbc5737ac85911714c8cd9c224b2cb7813a3876e454e081f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://902d411d492e06565800c28dd1a711b56345ef091ae08543b17f283632d9176d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.859119 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T08:09:40Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 08:09:25.785738 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 08:09:25.787572 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1870838884/tls.crt::/tmp/serving-cert-1870838884/tls.key\\\\\\\"\\\\nI0130 08:09:40.862942 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 08:09:40.877393 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 08:09:40.877434 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 08:09:40.877467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 08:09:40.877474 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 08:09:40.887749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 08:09:40.887801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887810 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 08:09:40.887814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 08:09:40.887818 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 08:09:40.887821 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0130 08:09:40.887824 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0130 08:09:40.888404 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0130 08:09:40.890397 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f53bf7dfca462f12650874f2cf41b88c4
f423acdda333808c8375b1150d1b25e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T08:09:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T08:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T08:09:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.874216 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6658972f2f477703359c1ea09898f96fb1277b0210e37599d80f60cece3184e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f3fe7b04f825d7b37b1e1312ba6cf36ead423eb02de739ad0e8e1ec0a4dd715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.886143 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43d7a43b33ec817283e60b75e27d05e6050252a3b7a4db41c0b76052400301f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.899214 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.901007 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.901029 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.901038 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.901052 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.901061 4870 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:45Z","lastTransitionTime":"2026-01-30T08:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.910957 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:45 crc kubenswrapper[4870]: I0130 08:10:45.922198 4870 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T08:09:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c80dd1abdfdcca0a0e1a53bacfd6288b58888da634a1d551d6d47fa60b7766ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T08:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T08:10:45Z is after 2025-08-24T17:21:41Z" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.003718 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.003751 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.003759 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.003774 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.003784 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.083747 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 11:18:58.180615578 +0000 UTC Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.109729 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.109788 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.109800 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.109819 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.109837 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.214115 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.214175 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.214192 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.214216 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.214232 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.316997 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.317043 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.317058 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.317079 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.317094 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.420094 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.420143 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.420159 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.420182 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.420199 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.522913 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.522965 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.522982 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.523006 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.523023 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.626269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.626340 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.626359 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.626382 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.626397 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.729163 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.729216 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.729236 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.729264 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.729283 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.832723 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.832776 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.832793 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.832815 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.832833 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.936119 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.936198 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.936211 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.936228 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:46 crc kubenswrapper[4870]: I0130 08:10:46.936238 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:46Z","lastTransitionTime":"2026-01-30T08:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.039193 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.039270 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.039283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.039306 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.039319 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.073946 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.074017 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.074037 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.074088 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:10:47 crc kubenswrapper[4870]: E0130 08:10:47.074148 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:10:47 crc kubenswrapper[4870]: E0130 08:10:47.074369 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:10:47 crc kubenswrapper[4870]: E0130 08:10:47.074520 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:47 crc kubenswrapper[4870]: E0130 08:10:47.074630 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.084231 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 14:25:52.741342434 +0000 UTC Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.142283 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.142323 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.142332 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.142365 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.142378 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.244759 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.244826 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.244848 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.244895 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.244914 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.348343 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.348428 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.348441 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.348460 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.348474 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.452004 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.452051 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.452062 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.452083 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.452099 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.556133 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.556210 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.556226 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.556253 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.556271 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.659124 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.659177 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.659191 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.659208 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.659222 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.762269 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.762325 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.762342 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.762363 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.762377 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.865012 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.865088 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.865101 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.865125 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.865140 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.967833 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.967886 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.967894 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.967906 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:47 crc kubenswrapper[4870]: I0130 08:10:47.967916 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:47Z","lastTransitionTime":"2026-01-30T08:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.070344 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.070394 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.070410 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.070433 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.070451 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.085234 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 02:03:34.626757545 +0000 UTC Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.173588 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.173625 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.173632 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.173646 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.173655 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.276532 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.276590 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.276613 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.276662 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.276695 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.380564 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.380634 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.380656 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.380682 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.380703 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.483584 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.483655 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.483676 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.483700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.483713 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.586671 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.586708 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.586718 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.586735 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.586747 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.689455 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.689499 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.689511 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.689527 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.689539 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.792659 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.792700 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.792711 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.792726 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.792737 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.895048 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.895116 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.895140 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.895162 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:48 crc kubenswrapper[4870]: I0130 08:10:48.895178 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:48.998507 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:48.998559 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:48.998634 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:48.998651 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:48.998667 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:48Z","lastTransitionTime":"2026-01-30T08:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.073541 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.073588 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.073622 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:49 crc kubenswrapper[4870]: E0130 08:10:49.073752 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.073791 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:49 crc kubenswrapper[4870]: E0130 08:10:49.073917 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:49 crc kubenswrapper[4870]: E0130 08:10:49.074070 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:49 crc kubenswrapper[4870]: E0130 08:10:49.074208 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.085955 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 20:07:08.930196246 +0000 UTC
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.101222 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.101575 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.101768 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.102016 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.102234 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:49Z","lastTransitionTime":"2026-01-30T08:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.205604 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.205652 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.205663 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.205686 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.205701 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:49Z","lastTransitionTime":"2026-01-30T08:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.229209 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.229346 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.229461 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.229655 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.229750 4870 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T08:10:49Z","lastTransitionTime":"2026-01-30T08:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.296584 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"]
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.297283 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.300041 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.303711 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.304116 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.304700 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.370202 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rrkfz" podStartSLOduration=62.370160133 podStartE2EDuration="1m2.370160133s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.338845741 +0000 UTC m=+88.034392850" watchObservedRunningTime="2026-01-30 08:10:49.370160133 +0000 UTC m=+88.065707242"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.384018 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5xdq" podStartSLOduration=62.383990986 podStartE2EDuration="1m2.383990986s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.383338746 +0000 UTC m=+88.078885865" watchObservedRunningTime="2026-01-30 08:10:49.383990986 +0000 UTC m=+88.079538095"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.429151 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.429134622 podStartE2EDuration="36.429134622s" podCreationTimestamp="2026-01-30 08:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.429106391 +0000 UTC m=+88.124653500" watchObservedRunningTime="2026-01-30 08:10:49.429134622 +0000 UTC m=+88.124681731"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.447033 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.447004153 podStartE2EDuration="1m8.447004153s" podCreationTimestamp="2026-01-30 08:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.4465923 +0000 UTC m=+88.142139429" watchObservedRunningTime="2026-01-30 08:10:49.447004153 +0000 UTC m=+88.142551262"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.455397 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ee0907b6-d3b0-44d3-b153-a06bd6922390-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.455462 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ee0907b6-d3b0-44d3-b153-a06bd6922390-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.455503 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee0907b6-d3b0-44d3-b153-a06bd6922390-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.455534 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0907b6-d3b0-44d3-b153-a06bd6922390-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.455553 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee0907b6-d3b0-44d3-b153-a06bd6922390-service-ca\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.530717 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8kvt7" podStartSLOduration=62.530682737 podStartE2EDuration="1m2.530682737s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.529457168 +0000 UTC m=+88.225004287" watchObservedRunningTime="2026-01-30 08:10:49.530682737 +0000 UTC m=+88.226229846"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.545734 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hsmrb" podStartSLOduration=62.545712348 podStartE2EDuration="1m2.545712348s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.545610805 +0000 UTC m=+88.241157924" watchObservedRunningTime="2026-01-30 08:10:49.545712348 +0000 UTC m=+88.241259457"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.556792 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ee0907b6-d3b0-44d3-b153-a06bd6922390-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.557095 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ee0907b6-d3b0-44d3-b153-a06bd6922390-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.557174 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ee0907b6-d3b0-44d3-b153-a06bd6922390-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.557014 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ee0907b6-d3b0-44d3-b153-a06bd6922390-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.557193 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee0907b6-d3b0-44d3-b153-a06bd6922390-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.557434 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0907b6-d3b0-44d3-b153-a06bd6922390-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.557490 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee0907b6-d3b0-44d3-b153-a06bd6922390-service-ca\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.558436 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee0907b6-d3b0-44d3-b153-a06bd6922390-service-ca\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.560923 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podStartSLOduration=62.560908135 podStartE2EDuration="1m2.560908135s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.560422509 +0000 UTC m=+88.255969648" watchObservedRunningTime="2026-01-30 08:10:49.560908135 +0000 UTC m=+88.256455244"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.572605 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0907b6-d3b0-44d3-b153-a06bd6922390-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.577178 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee0907b6-d3b0-44d3-b153-a06bd6922390-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-np8ts\" (UID: \"ee0907b6-d3b0-44d3-b153-a06bd6922390\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.609794 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.609769586 podStartE2EDuration="1m8.609769586s" podCreationTimestamp="2026-01-30 08:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.608246869 +0000 UTC m=+88.303793978" watchObservedRunningTime="2026-01-30 08:10:49.609769586 +0000 UTC m=+88.305316685"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.612257 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dpj7j" podStartSLOduration=62.612224234 podStartE2EDuration="1m2.612224234s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.582626006 +0000 UTC m=+88.278173135" watchObservedRunningTime="2026-01-30 08:10:49.612224234 +0000 UTC m=+88.307771353"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.615095 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts"
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.629642 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.629570538 podStartE2EDuration="1m7.629570538s" podCreationTimestamp="2026-01-30 08:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:49.627055179 +0000 UTC m=+88.322602288" watchObservedRunningTime="2026-01-30 08:10:49.629570538 +0000 UTC m=+88.325117647"
Jan 30 08:10:49 crc kubenswrapper[4870]: W0130 08:10:49.632514 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee0907b6_d3b0_44d3_b153_a06bd6922390.slice/crio-4ec29929bd8f01f0b4fa59f1ffb310c74c4006f44dbf7e13c2704860e0a334b6 WatchSource:0}: Error finding container 4ec29929bd8f01f0b4fa59f1ffb310c74c4006f44dbf7e13c2704860e0a334b6: Status 404 returned error can't find the container with id 4ec29929bd8f01f0b4fa59f1ffb310c74c4006f44dbf7e13c2704860e0a334b6
Jan 30 08:10:49 crc kubenswrapper[4870]: I0130 08:10:49.662499 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts" event={"ID":"ee0907b6-d3b0-44d3-b153-a06bd6922390","Type":"ContainerStarted","Data":"4ec29929bd8f01f0b4fa59f1ffb310c74c4006f44dbf7e13c2704860e0a334b6"}
Jan 30 08:10:50 crc kubenswrapper[4870]: I0130 08:10:50.087232 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 22:20:20.048701557 +0000 UTC
Jan 30 08:10:50 crc kubenswrapper[4870]: I0130 08:10:50.087696 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Jan 30 08:10:50 crc kubenswrapper[4870]: I0130 08:10:50.087513 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Jan 30 08:10:50 crc kubenswrapper[4870]: I0130 08:10:50.094479 4870 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 30 08:10:50 crc kubenswrapper[4870]: I0130 08:10:50.666657 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts" event={"ID":"ee0907b6-d3b0-44d3-b153-a06bd6922390","Type":"ContainerStarted","Data":"cd73d9c53fc6d9653ff691866e80fec8f89ff677dd18e02d2a07d7c56a5901bc"}
Jan 30 08:10:50 crc kubenswrapper[4870]: I0130 08:10:50.700666 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.700640347 podStartE2EDuration="700.640347ms" podCreationTimestamp="2026-01-30 08:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:50.68192708 +0000 UTC m=+89.377474229" watchObservedRunningTime="2026-01-30 08:10:50.700640347 +0000 UTC m=+89.396187496"
Jan 30 08:10:50 crc kubenswrapper[4870]: I0130 08:10:50.701625 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-np8ts" podStartSLOduration=63.701614147 podStartE2EDuration="1m3.701614147s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:10:50.700335427 +0000 UTC m=+89.395882596" watchObservedRunningTime="2026-01-30 08:10:50.701614147 +0000 UTC m=+89.397161286"
Jan 30 08:10:51 crc kubenswrapper[4870]: I0130 08:10:51.074326 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:51 crc kubenswrapper[4870]: I0130 08:10:51.074431 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:51 crc kubenswrapper[4870]: E0130 08:10:51.074477 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:51 crc kubenswrapper[4870]: I0130 08:10:51.074436 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:51 crc kubenswrapper[4870]: E0130 08:10:51.074563 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:51 crc kubenswrapper[4870]: I0130 08:10:51.074596 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:51 crc kubenswrapper[4870]: E0130 08:10:51.074696 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:51 crc kubenswrapper[4870]: E0130 08:10:51.074781 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:53 crc kubenswrapper[4870]: I0130 08:10:53.074034 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:53 crc kubenswrapper[4870]: I0130 08:10:53.074060 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:53 crc kubenswrapper[4870]: I0130 08:10:53.074036 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:53 crc kubenswrapper[4870]: I0130 08:10:53.074108 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:53 crc kubenswrapper[4870]: E0130 08:10:53.074248 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:53 crc kubenswrapper[4870]: E0130 08:10:53.074312 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:53 crc kubenswrapper[4870]: E0130 08:10:53.074379 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:53 crc kubenswrapper[4870]: E0130 08:10:53.074438 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:55 crc kubenswrapper[4870]: I0130 08:10:55.074001 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:55 crc kubenswrapper[4870]: I0130 08:10:55.074053 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:55 crc kubenswrapper[4870]: I0130 08:10:55.074166 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:55 crc kubenswrapper[4870]: E0130 08:10:55.074183 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:55 crc kubenswrapper[4870]: I0130 08:10:55.074213 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:55 crc kubenswrapper[4870]: E0130 08:10:55.074311 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:55 crc kubenswrapper[4870]: E0130 08:10:55.074401 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:55 crc kubenswrapper[4870]: E0130 08:10:55.074460 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:56 crc kubenswrapper[4870]: I0130 08:10:56.075596 4870 scope.go:117] "RemoveContainer" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258"
Jan 30 08:10:56 crc kubenswrapper[4870]: E0130 08:10:56.076139 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b"
Jan 30 08:10:57 crc kubenswrapper[4870]: I0130 08:10:57.074489 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:57 crc kubenswrapper[4870]: E0130 08:10:57.074636 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:10:57 crc kubenswrapper[4870]: I0130 08:10:57.074672 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:57 crc kubenswrapper[4870]: E0130 08:10:57.074780 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:57 crc kubenswrapper[4870]: I0130 08:10:57.074924 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:57 crc kubenswrapper[4870]: I0130 08:10:57.074938 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:57 crc kubenswrapper[4870]: E0130 08:10:57.074967 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:57 crc kubenswrapper[4870]: E0130 08:10:57.075142 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:10:59 crc kubenswrapper[4870]: I0130 08:10:59.073517 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:10:59 crc kubenswrapper[4870]: E0130 08:10:59.073933 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:10:59 crc kubenswrapper[4870]: I0130 08:10:59.073596 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:10:59 crc kubenswrapper[4870]: I0130 08:10:59.073596 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:10:59 crc kubenswrapper[4870]: E0130 08:10:59.074005 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:10:59 crc kubenswrapper[4870]: I0130 08:10:59.073630 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:10:59 crc kubenswrapper[4870]: E0130 08:10:59.074160 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:10:59 crc kubenswrapper[4870]: E0130 08:10:59.074270 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:01 crc kubenswrapper[4870]: I0130 08:11:01.073986 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:01 crc kubenswrapper[4870]: I0130 08:11:01.074021 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:01 crc kubenswrapper[4870]: I0130 08:11:01.074098 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:01 crc kubenswrapper[4870]: E0130 08:11:01.074184 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:01 crc kubenswrapper[4870]: I0130 08:11:01.074224 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:01 crc kubenswrapper[4870]: E0130 08:11:01.074298 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:01 crc kubenswrapper[4870]: E0130 08:11:01.074397 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:01 crc kubenswrapper[4870]: E0130 08:11:01.074447 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:03 crc kubenswrapper[4870]: I0130 08:11:03.074463 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:03 crc kubenswrapper[4870]: E0130 08:11:03.075319 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:03 crc kubenswrapper[4870]: I0130 08:11:03.074547 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:03 crc kubenswrapper[4870]: I0130 08:11:03.074602 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:03 crc kubenswrapper[4870]: E0130 08:11:03.075718 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:03 crc kubenswrapper[4870]: I0130 08:11:03.074514 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:03 crc kubenswrapper[4870]: E0130 08:11:03.075862 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:03 crc kubenswrapper[4870]: E0130 08:11:03.076353 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:05 crc kubenswrapper[4870]: I0130 08:11:05.074732 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:05 crc kubenswrapper[4870]: I0130 08:11:05.074739 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:05 crc kubenswrapper[4870]: E0130 08:11:05.074974 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:05 crc kubenswrapper[4870]: I0130 08:11:05.074772 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:05 crc kubenswrapper[4870]: I0130 08:11:05.074726 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:05 crc kubenswrapper[4870]: E0130 08:11:05.075200 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:05 crc kubenswrapper[4870]: E0130 08:11:05.075233 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:05 crc kubenswrapper[4870]: E0130 08:11:05.075295 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:05 crc kubenswrapper[4870]: I0130 08:11:05.548985 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:05 crc kubenswrapper[4870]: E0130 08:11:05.549188 4870 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:11:05 crc kubenswrapper[4870]: E0130 08:11:05.549289 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs podName:7b976744-b72d-4291-a32f-437fc1cfbf03 nodeName:}" failed. No retries permitted until 2026-01-30 08:12:09.549265883 +0000 UTC m=+168.244813172 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs") pod "network-metrics-daemon-mp9vw" (UID: "7b976744-b72d-4291-a32f-437fc1cfbf03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 08:11:07 crc kubenswrapper[4870]: I0130 08:11:07.074513 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:07 crc kubenswrapper[4870]: I0130 08:11:07.074558 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:07 crc kubenswrapper[4870]: I0130 08:11:07.074589 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:07 crc kubenswrapper[4870]: E0130 08:11:07.074661 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:07 crc kubenswrapper[4870]: I0130 08:11:07.074690 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:07 crc kubenswrapper[4870]: E0130 08:11:07.074775 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:07 crc kubenswrapper[4870]: E0130 08:11:07.075198 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:07 crc kubenswrapper[4870]: E0130 08:11:07.075175 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:08 crc kubenswrapper[4870]: I0130 08:11:08.075646 4870 scope.go:117] "RemoveContainer" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258" Jan 30 08:11:08 crc kubenswrapper[4870]: E0130 08:11:08.076000 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" Jan 30 08:11:09 crc kubenswrapper[4870]: I0130 08:11:09.074273 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:09 crc kubenswrapper[4870]: I0130 08:11:09.074355 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:09 crc kubenswrapper[4870]: I0130 08:11:09.074379 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:09 crc kubenswrapper[4870]: E0130 08:11:09.074467 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:09 crc kubenswrapper[4870]: I0130 08:11:09.074508 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:09 crc kubenswrapper[4870]: E0130 08:11:09.074583 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:09 crc kubenswrapper[4870]: E0130 08:11:09.074687 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:09 crc kubenswrapper[4870]: E0130 08:11:09.074807 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:11 crc kubenswrapper[4870]: I0130 08:11:11.073564 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:11 crc kubenswrapper[4870]: I0130 08:11:11.073602 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:11 crc kubenswrapper[4870]: I0130 08:11:11.073613 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:11 crc kubenswrapper[4870]: I0130 08:11:11.073753 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:11 crc kubenswrapper[4870]: E0130 08:11:11.073907 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:11 crc kubenswrapper[4870]: E0130 08:11:11.074138 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:11 crc kubenswrapper[4870]: E0130 08:11:11.074279 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:11 crc kubenswrapper[4870]: E0130 08:11:11.074362 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:13 crc kubenswrapper[4870]: I0130 08:11:13.073696 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:13 crc kubenswrapper[4870]: I0130 08:11:13.073701 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:13 crc kubenswrapper[4870]: I0130 08:11:13.073741 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:13 crc kubenswrapper[4870]: I0130 08:11:13.073707 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:13 crc kubenswrapper[4870]: E0130 08:11:13.074065 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:13 crc kubenswrapper[4870]: E0130 08:11:13.074189 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:13 crc kubenswrapper[4870]: E0130 08:11:13.074359 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:13 crc kubenswrapper[4870]: E0130 08:11:13.074555 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:15 crc kubenswrapper[4870]: I0130 08:11:15.074666 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:15 crc kubenswrapper[4870]: I0130 08:11:15.074664 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:15 crc kubenswrapper[4870]: I0130 08:11:15.074716 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:15 crc kubenswrapper[4870]: E0130 08:11:15.074921 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:15 crc kubenswrapper[4870]: I0130 08:11:15.074993 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:15 crc kubenswrapper[4870]: E0130 08:11:15.075063 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 08:11:15 crc kubenswrapper[4870]: E0130 08:11:15.075160 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:15 crc kubenswrapper[4870]: E0130 08:11:15.075499 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:17 crc kubenswrapper[4870]: I0130 08:11:17.073808 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:17 crc kubenswrapper[4870]: I0130 08:11:17.073848 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:17 crc kubenswrapper[4870]: I0130 08:11:17.073955 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:17 crc kubenswrapper[4870]: E0130 08:11:17.075092 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:17 crc kubenswrapper[4870]: E0130 08:11:17.075221 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:17 crc kubenswrapper[4870]: I0130 08:11:17.073988 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:17 crc kubenswrapper[4870]: E0130 08:11:17.075444 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:17 crc kubenswrapper[4870]: E0130 08:11:17.075795 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:19 crc kubenswrapper[4870]: I0130 08:11:19.074037 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:19 crc kubenswrapper[4870]: I0130 08:11:19.074176 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:19 crc kubenswrapper[4870]: I0130 08:11:19.074208 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:19 crc kubenswrapper[4870]: E0130 08:11:19.074362 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:19 crc kubenswrapper[4870]: E0130 08:11:19.074566 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:19 crc kubenswrapper[4870]: E0130 08:11:19.074621 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:19 crc kubenswrapper[4870]: I0130 08:11:19.075015 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:19 crc kubenswrapper[4870]: E0130 08:11:19.075243 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.074763 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.074805 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.074839 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:21 crc kubenswrapper[4870]: E0130 08:11:21.074962 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.075038 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:21 crc kubenswrapper[4870]: E0130 08:11:21.075125 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:21 crc kubenswrapper[4870]: E0130 08:11:21.075237 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:21 crc kubenswrapper[4870]: E0130 08:11:21.075374 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.077650 4870 scope.go:117] "RemoveContainer" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258"
Jan 30 08:11:21 crc kubenswrapper[4870]: E0130 08:11:21.078109 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cj5db_openshift-ovn-kubernetes(36037609-52f9-4c09-8beb-6d35a039347b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.870657 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/1.log"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.871666 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/0.log"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.871738 4870 generic.go:334] "Generic (PLEG): container finished" podID="3e8e9e25-2b9b-4820-8282-48e1d930a721" containerID="e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6" exitCode=1
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.871787 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerDied","Data":"e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6"}
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.871838 4870 scope.go:117] "RemoveContainer" containerID="f51ff6706b07d9e1323a28f982e208fa5e28cf7193da7addb0bb82616f72aa1a"
Jan 30 08:11:21 crc kubenswrapper[4870]: I0130 08:11:21.872672 4870 scope.go:117] "RemoveContainer" containerID="e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6"
Jan 30 08:11:21 crc kubenswrapper[4870]: E0130 08:11:21.873041 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-hsmrb_openshift-multus(3e8e9e25-2b9b-4820-8282-48e1d930a721)\"" pod="openshift-multus/multus-hsmrb" podUID="3e8e9e25-2b9b-4820-8282-48e1d930a721"
Jan 30 08:11:22 crc kubenswrapper[4870]: E0130 08:11:22.044155 4870 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Jan 30 08:11:22 crc kubenswrapper[4870]: E0130 08:11:22.181647 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 08:11:22 crc kubenswrapper[4870]: I0130 08:11:22.879595 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/1.log"
Jan 30 08:11:23 crc kubenswrapper[4870]: I0130 08:11:23.074619 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:23 crc kubenswrapper[4870]: I0130 08:11:23.074746 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:23 crc kubenswrapper[4870]: I0130 08:11:23.074918 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:23 crc kubenswrapper[4870]: I0130 08:11:23.075151 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:23 crc kubenswrapper[4870]: E0130 08:11:23.075125 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:23 crc kubenswrapper[4870]: E0130 08:11:23.075398 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:23 crc kubenswrapper[4870]: E0130 08:11:23.075451 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:23 crc kubenswrapper[4870]: E0130 08:11:23.075552 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:25 crc kubenswrapper[4870]: I0130 08:11:25.073969 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:25 crc kubenswrapper[4870]: I0130 08:11:25.074043 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:25 crc kubenswrapper[4870]: I0130 08:11:25.074058 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:25 crc kubenswrapper[4870]: I0130 08:11:25.074059 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:25 crc kubenswrapper[4870]: E0130 08:11:25.074241 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:25 crc kubenswrapper[4870]: E0130 08:11:25.074373 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:25 crc kubenswrapper[4870]: E0130 08:11:25.074481 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:25 crc kubenswrapper[4870]: E0130 08:11:25.074598 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:27 crc kubenswrapper[4870]: I0130 08:11:27.074136 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:27 crc kubenswrapper[4870]: I0130 08:11:27.074160 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:27 crc kubenswrapper[4870]: E0130 08:11:27.074359 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:27 crc kubenswrapper[4870]: I0130 08:11:27.074163 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:27 crc kubenswrapper[4870]: E0130 08:11:27.074562 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:27 crc kubenswrapper[4870]: E0130 08:11:27.074750 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:27 crc kubenswrapper[4870]: I0130 08:11:27.074954 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:27 crc kubenswrapper[4870]: E0130 08:11:27.075062 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:27 crc kubenswrapper[4870]: E0130 08:11:27.183840 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 08:11:29 crc kubenswrapper[4870]: I0130 08:11:29.073597 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:29 crc kubenswrapper[4870]: I0130 08:11:29.073624 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:29 crc kubenswrapper[4870]: E0130 08:11:29.074298 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:29 crc kubenswrapper[4870]: I0130 08:11:29.073767 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:29 crc kubenswrapper[4870]: I0130 08:11:29.073732 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:29 crc kubenswrapper[4870]: E0130 08:11:29.074375 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:29 crc kubenswrapper[4870]: E0130 08:11:29.074487 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:29 crc kubenswrapper[4870]: E0130 08:11:29.074663 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:31 crc kubenswrapper[4870]: I0130 08:11:31.074058 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:31 crc kubenswrapper[4870]: I0130 08:11:31.074092 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:31 crc kubenswrapper[4870]: I0130 08:11:31.074071 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:31 crc kubenswrapper[4870]: E0130 08:11:31.074217 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:31 crc kubenswrapper[4870]: I0130 08:11:31.074364 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:31 crc kubenswrapper[4870]: E0130 08:11:31.074431 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:31 crc kubenswrapper[4870]: E0130 08:11:31.074475 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:31 crc kubenswrapper[4870]: E0130 08:11:31.074538 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:32 crc kubenswrapper[4870]: E0130 08:11:32.184489 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 08:11:33 crc kubenswrapper[4870]: I0130 08:11:33.073692 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:33 crc kubenswrapper[4870]: I0130 08:11:33.073757 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:33 crc kubenswrapper[4870]: I0130 08:11:33.073692 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:33 crc kubenswrapper[4870]: I0130 08:11:33.073864 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:33 crc kubenswrapper[4870]: E0130 08:11:33.073984 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:33 crc kubenswrapper[4870]: E0130 08:11:33.074144 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:33 crc kubenswrapper[4870]: E0130 08:11:33.074250 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:33 crc kubenswrapper[4870]: E0130 08:11:33.074307 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:34 crc kubenswrapper[4870]: I0130 08:11:34.075215 4870 scope.go:117] "RemoveContainer" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258"
Jan 30 08:11:34 crc kubenswrapper[4870]: I0130 08:11:34.926601 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/3.log"
Jan 30 08:11:34 crc kubenswrapper[4870]: I0130 08:11:34.930229 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerStarted","Data":"b741960d899fead07c73e8ea4b750a10bd019b223fe9d09e7a67a573f3e4bee3"}
Jan 30 08:11:34 crc kubenswrapper[4870]: I0130 08:11:34.930709 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db"
Jan 30 08:11:34 crc kubenswrapper[4870]: I0130 08:11:34.959329 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podStartSLOduration=107.95930847 podStartE2EDuration="1m47.95930847s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:34.95765572 +0000 UTC m=+133.653202849" watchObservedRunningTime="2026-01-30 08:11:34.95930847 +0000 UTC m=+133.654855589"
Jan 30 08:11:34 crc kubenswrapper[4870]: I0130 08:11:34.993442 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mp9vw"]
Jan 30 08:11:34 crc kubenswrapper[4870]: I0130 08:11:34.993653 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:34 crc kubenswrapper[4870]: E0130 08:11:34.993773 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:35 crc kubenswrapper[4870]: I0130 08:11:35.074301 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:35 crc kubenswrapper[4870]: I0130 08:11:35.074382 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:35 crc kubenswrapper[4870]: E0130 08:11:35.074427 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:35 crc kubenswrapper[4870]: I0130 08:11:35.074617 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:35 crc kubenswrapper[4870]: E0130 08:11:35.074603 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:35 crc kubenswrapper[4870]: E0130 08:11:35.074671 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:36 crc kubenswrapper[4870]: I0130 08:11:36.074708 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:36 crc kubenswrapper[4870]: E0130 08:11:36.074924 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:37 crc kubenswrapper[4870]: I0130 08:11:37.074500 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:37 crc kubenswrapper[4870]: I0130 08:11:37.074591 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:37 crc kubenswrapper[4870]: I0130 08:11:37.074618 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:37 crc kubenswrapper[4870]: E0130 08:11:37.074686 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:37 crc kubenswrapper[4870]: E0130 08:11:37.074896 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:37 crc kubenswrapper[4870]: E0130 08:11:37.075060 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:37 crc kubenswrapper[4870]: I0130 08:11:37.075550 4870 scope.go:117] "RemoveContainer" containerID="e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6"
Jan 30 08:11:37 crc kubenswrapper[4870]: E0130 08:11:37.185653 4870 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 08:11:37 crc kubenswrapper[4870]: I0130 08:11:37.945382 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/1.log"
Jan 30 08:11:37 crc kubenswrapper[4870]: I0130 08:11:37.945454 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerStarted","Data":"61538cdbec39ead4232db7d69f7b41605b3dfdb222395b4c93251e6ded8b3e41"}
Jan 30 08:11:38 crc kubenswrapper[4870]: I0130 08:11:38.074064 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:38 crc kubenswrapper[4870]: E0130 08:11:38.074315 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:39 crc kubenswrapper[4870]: I0130 08:11:39.074091 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:39 crc kubenswrapper[4870]: I0130 08:11:39.074161 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:39 crc kubenswrapper[4870]: E0130 08:11:39.074306 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 08:11:39 crc kubenswrapper[4870]: I0130 08:11:39.074606 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:39 crc kubenswrapper[4870]: E0130 08:11:39.074708 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 08:11:39 crc kubenswrapper[4870]: E0130 08:11:39.075077 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:40 crc kubenswrapper[4870]: I0130 08:11:40.074709 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw"
Jan 30 08:11:40 crc kubenswrapper[4870]: E0130 08:11:40.074980 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03"
Jan 30 08:11:41 crc kubenswrapper[4870]: I0130 08:11:41.073859 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 08:11:41 crc kubenswrapper[4870]: I0130 08:11:41.073898 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 08:11:41 crc kubenswrapper[4870]: E0130 08:11:41.074409 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 08:11:41 crc kubenswrapper[4870]: I0130 08:11:41.073932 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 08:11:41 crc kubenswrapper[4870]: E0130 08:11:41.074560 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 08:11:41 crc kubenswrapper[4870]: E0130 08:11:41.074759 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 08:11:42 crc kubenswrapper[4870]: I0130 08:11:42.075241 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:42 crc kubenswrapper[4870]: E0130 08:11:42.076011 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mp9vw" podUID="7b976744-b72d-4291-a32f-437fc1cfbf03" Jan 30 08:11:43 crc kubenswrapper[4870]: I0130 08:11:43.074027 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:43 crc kubenswrapper[4870]: I0130 08:11:43.074113 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:43 crc kubenswrapper[4870]: I0130 08:11:43.074039 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:43 crc kubenswrapper[4870]: I0130 08:11:43.076845 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 08:11:43 crc kubenswrapper[4870]: I0130 08:11:43.077002 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 30 08:11:43 crc kubenswrapper[4870]: I0130 08:11:43.077187 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 30 08:11:43 crc kubenswrapper[4870]: I0130 08:11:43.077645 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 30 08:11:44 crc kubenswrapper[4870]: I0130 08:11:44.074118 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:11:44 crc kubenswrapper[4870]: I0130 08:11:44.077783 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 30 08:11:44 crc kubenswrapper[4870]: I0130 08:11:44.077865 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 30 08:11:48 crc kubenswrapper[4870]: I0130 08:11:48.939506 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:48 crc kubenswrapper[4870]: E0130 08:11:48.939725 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:13:50.939683142 +0000 UTC m=+269.635230261 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.041349 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.041405 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.041431 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.041453 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.042642 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.048504 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.049300 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.050046 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.095648 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.110588 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:49 crc kubenswrapper[4870]: I0130 08:11:49.117530 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 08:11:49 crc kubenswrapper[4870]: W0130 08:11:49.341522 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-82e1818833b47c8b1e15c8e8b7294b1b88861e6ac1525cbba927937ed04a77fe WatchSource:0}: Error finding container 82e1818833b47c8b1e15c8e8b7294b1b88861e6ac1525cbba927937ed04a77fe: Status 404 returned error can't find the container with id 82e1818833b47c8b1e15c8e8b7294b1b88861e6ac1525cbba927937ed04a77fe Jan 30 08:11:49 crc kubenswrapper[4870]: W0130 08:11:49.349105 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-bb4b150b8a7ebc4a00e0d09cb44c52a1363567f24e299f34f5b786cfeb3e4a05 WatchSource:0}: Error finding container bb4b150b8a7ebc4a00e0d09cb44c52a1363567f24e299f34f5b786cfeb3e4a05: Status 404 returned error can't find the container with id bb4b150b8a7ebc4a00e0d09cb44c52a1363567f24e299f34f5b786cfeb3e4a05 Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 
08:11:50.002564 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cc95b5edd8b272888f6271d7358aded96654ea07635ccee9a35987508d35b43b"} Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.005539 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8e66ca15d727c788aba5896683ae5c42367011461eb3c263e4bb4623e8a053a2"} Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.005597 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7983ec701a449c7dd8cbee8a83e699b7c4adefed08d7718b1361dcd2ef740a69"} Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.005635 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"82e1818833b47c8b1e15c8e8b7294b1b88861e6ac1525cbba927937ed04a77fe"} Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.008364 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2bf75989a5a217dae0b7f0756ecc1b7808a21afc7e0a76e1d81a29445f78c450"} Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.008482 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bb4b150b8a7ebc4a00e0d09cb44c52a1363567f24e299f34f5b786cfeb3e4a05"} 
Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.008785 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.755505 4870 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.825260 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2g2tj"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.827113 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.829413 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.835557 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.835767 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.835992 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.835563 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.836464 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.838851 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkf4z"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.839228 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.841354 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.841618 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.842051 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.842230 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.842278 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.842509 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.842691 4870 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.842248 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.842929 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.843051 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xxrkx"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.843948 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.847298 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-v7bvt"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.847951 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.848036 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.848136 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-v7bvt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.848260 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.850110 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.850948 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.851445 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.852207 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.856248 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.856539 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.856685 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.856758 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.861317 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc"] Jan 
30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.862082 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.862183 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cdzxd"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.862994 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.863855 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jr94b"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.864815 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.864944 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.865506 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.868945 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.869672 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.870253 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.870297 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.870484 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ssxgx"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.871022 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.871114 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.874116 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2mj87"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.874788 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.879222 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.879464 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.879558 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sfs65"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.879776 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.880425 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.883934 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s6768"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.884744 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.885312 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.885547 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.885635 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886020 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886247 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886335 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886020 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886433 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886335 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886257 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886739 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886796 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.886815 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.889455 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.889838 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.890244 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.890413 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.890510 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.890661 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.890951 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.891423 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.891527 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.891950 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.892399 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.906572 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.911250 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.914157 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.915021 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.915442 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.920771 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.923384 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.924500 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.929052 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.929206 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.929363 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.929367 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.960866 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.961135 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.961214 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.962726 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.963028 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.963110 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.963276 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.963477 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.963644 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.963773 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.964039 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.964105 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 08:11:50 crc 
kubenswrapper[4870]: I0130 08:11:50.964264 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966082 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-serving-cert\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966115 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-etcd-serving-ca\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966141 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62vw\" (UniqueName: \"kubernetes.io/projected/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-kube-api-access-l62vw\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966163 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpw6w\" (UniqueName: \"kubernetes.io/projected/042ed63b-a1a9-4072-ae87-71b9fb98280c-kube-api-access-lpw6w\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:50 crc 
kubenswrapper[4870]: I0130 08:11:50.966192 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-image-import-ca\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966212 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-audit\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966231 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8bd\" (UniqueName: \"kubernetes.io/projected/b2cd7eb7-87cb-44dc-a01f-17985460c12c-kube-api-access-8x8bd\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966246 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-etcd-client\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966265 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966286 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pjr\" (UniqueName: \"kubernetes.io/projected/15eddd48-9a41-41cb-a284-80d01c7f8aad-kube-api-access-25pjr\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966303 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042ed63b-a1a9-4072-ae87-71b9fb98280c-config\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966322 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/042ed63b-a1a9-4072-ae87-71b9fb98280c-images\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966338 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-config\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966465 4870 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966546 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2cd7eb7-87cb-44dc-a01f-17985460c12c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966579 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41ae1460-1e39-4d11-9357-3e0111521a8e-metrics-tls\") pod \"dns-operator-744455d44c-s6768\" (UID: \"41ae1460-1e39-4d11-9357-3e0111521a8e\") " pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966602 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0373f9a1-1537-4f29-905a-b0fb2affc113-node-pullsecrets\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966662 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnb8\" (UniqueName: \"kubernetes.io/projected/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-kube-api-access-rgnb8\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966694 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966702 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966714 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966732 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739bcba5-d8ef-45fe-abf9-02d74d0d093c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966752 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0373f9a1-1537-4f29-905a-b0fb2affc113-audit-dir\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966784 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/739bcba5-d8ef-45fe-abf9-02d74d0d093c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966802 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966806 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/15eddd48-9a41-41cb-a284-80d01c7f8aad-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966827 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-encryption-config\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966849 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-config\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966864 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/739bcba5-d8ef-45fe-abf9-02d74d0d093c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966911 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2cd7eb7-87cb-44dc-a01f-17985460c12c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966935 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/042ed63b-a1a9-4072-ae87-71b9fb98280c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.966966 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtnn7\" (UniqueName: \"kubernetes.io/projected/41ae1460-1e39-4d11-9357-3e0111521a8e-kube-api-access-xtnn7\") pod \"dns-operator-744455d44c-s6768\" (UID: \"41ae1460-1e39-4d11-9357-3e0111521a8e\") " pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967005 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-trusted-ca\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967037 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66fq7\" (UniqueName: \"kubernetes.io/projected/0373f9a1-1537-4f29-905a-b0fb2affc113-kube-api-access-66fq7\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967079 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-serving-cert\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967096 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15eddd48-9a41-41cb-a284-80d01c7f8aad-serving-cert\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967160 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967284 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967347 4870 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967444 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967574 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967708 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967821 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.967978 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.968064 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.968182 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.968295 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.968595 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.968710 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 30 08:11:50 crc kubenswrapper[4870]: 
I0130 08:11:50.968894 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.968999 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.969098 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.969191 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.969337 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.969449 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.969650 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.969690 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.969894 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.970038 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.970146 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 
08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.970249 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.970349 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.972311 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.976540 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.977197 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.977348 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g6x2r"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.977953 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l6p59"] Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.978422 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.979189 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.979408 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.980627 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.981397 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.990552 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.995380 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.998199 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 08:11:50 crc kubenswrapper[4870]: I0130 08:11:50.998214 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.001111 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.002769 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-dfwzs"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.003582 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2l7mq"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.023799 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.023899 4870 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.023963 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.025111 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.026658 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.026940 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.028312 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.029005 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.029638 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.032775 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2g2tj"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.036950 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.050564 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.050612 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.050686 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.050564 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.050571 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.050835 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.051175 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.053937 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.054568 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.054903 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.055016 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.055658 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.056047 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.056667 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.056697 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.056709 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.056866 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.057454 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.059023 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jh9j6"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.060181 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.062498 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.063166 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ssxgx"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.063244 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.065630 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.068995 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069548 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-etcd-client\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069627 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069653 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pjr\" (UniqueName: \"kubernetes.io/projected/15eddd48-9a41-41cb-a284-80d01c7f8aad-kube-api-access-25pjr\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069676 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042ed63b-a1a9-4072-ae87-71b9fb98280c-config\") pod 
\"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069707 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/042ed63b-a1a9-4072-ae87-71b9fb98280c-images\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069727 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-config\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069757 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2cd7eb7-87cb-44dc-a01f-17985460c12c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069777 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41ae1460-1e39-4d11-9357-3e0111521a8e-metrics-tls\") pod \"dns-operator-744455d44c-s6768\" (UID: \"41ae1460-1e39-4d11-9357-3e0111521a8e\") " pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069796 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/0373f9a1-1537-4f29-905a-b0fb2affc113-node-pullsecrets\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069819 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnb8\" (UniqueName: \"kubernetes.io/projected/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-kube-api-access-rgnb8\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069837 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069857 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069888 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739bcba5-d8ef-45fe-abf9-02d74d0d093c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069908 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0373f9a1-1537-4f29-905a-b0fb2affc113-audit-dir\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069935 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/739bcba5-d8ef-45fe-abf9-02d74d0d093c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069957 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/15eddd48-9a41-41cb-a284-80d01c7f8aad-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069974 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-encryption-config\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.069996 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-config\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070015 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/739bcba5-d8ef-45fe-abf9-02d74d0d093c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070037 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2cd7eb7-87cb-44dc-a01f-17985460c12c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070067 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/042ed63b-a1a9-4072-ae87-71b9fb98280c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070088 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtnn7\" (UniqueName: \"kubernetes.io/projected/41ae1460-1e39-4d11-9357-3e0111521a8e-kube-api-access-xtnn7\") pod \"dns-operator-744455d44c-s6768\" (UID: \"41ae1460-1e39-4d11-9357-3e0111521a8e\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070108 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-trusted-ca\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070131 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66fq7\" (UniqueName: \"kubernetes.io/projected/0373f9a1-1537-4f29-905a-b0fb2affc113-kube-api-access-66fq7\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070155 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-serving-cert\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070176 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15eddd48-9a41-41cb-a284-80d01c7f8aad-serving-cert\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070202 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-serving-cert\") pod \"console-operator-58897d9998-cdzxd\" (UID: 
\"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070223 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-etcd-serving-ca\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070246 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62vw\" (UniqueName: \"kubernetes.io/projected/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-kube-api-access-l62vw\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.070267 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpw6w\" (UniqueName: \"kubernetes.io/projected/042ed63b-a1a9-4072-ae87-71b9fb98280c-kube-api-access-lpw6w\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.071029 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-image-import-ca\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.071051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-audit\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.071074 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8bd\" (UniqueName: \"kubernetes.io/projected/b2cd7eb7-87cb-44dc-a01f-17985460c12c-kube-api-access-8x8bd\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.071553 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.071604 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.072180 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0373f9a1-1537-4f29-905a-b0fb2affc113-node-pullsecrets\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.072206 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.073459 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-config\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") 
" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.073521 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/042ed63b-a1a9-4072-ae87-71b9fb98280c-images\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.074347 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkf4z"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.074373 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s6768"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.075402 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2cd7eb7-87cb-44dc-a01f-17985460c12c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.075710 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-config\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.075937 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0373f9a1-1537-4f29-905a-b0fb2affc113-audit-dir\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.076751 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739bcba5-d8ef-45fe-abf9-02d74d0d093c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.076825 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.076967 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/15eddd48-9a41-41cb-a284-80d01c7f8aad-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.077762 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-etcd-serving-ca\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.079008 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-image-import-ca\") pod \"apiserver-76f77b778f-2g2tj\" 
(UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.079364 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-encryption-config\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.079529 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0373f9a1-1537-4f29-905a-b0fb2affc113-audit\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.081338 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2mj87"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.082911 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042ed63b-a1a9-4072-ae87-71b9fb98280c-config\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.083265 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mnsdp"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.083486 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-trusted-ca\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:51 crc 
kubenswrapper[4870]: I0130 08:11:51.085365 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.085528 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.086112 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cdzxd"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.086440 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/739bcba5-d8ef-45fe-abf9-02d74d0d093c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.089950 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-serving-cert\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.089973 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41ae1460-1e39-4d11-9357-3e0111521a8e-metrics-tls\") pod \"dns-operator-744455d44c-s6768\" (UID: \"41ae1460-1e39-4d11-9357-3e0111521a8e\") " pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.090850 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 30 08:11:51 crc 
kubenswrapper[4870]: I0130 08:11:51.091316 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.093184 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15eddd48-9a41-41cb-a284-80d01c7f8aad-serving-cert\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.093633 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2cd7eb7-87cb-44dc-a01f-17985460c12c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.094932 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jr94b"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.097720 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v7bvt"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.102866 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/042ed63b-a1a9-4072-ae87-71b9fb98280c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.109084 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-serving-cert\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.110974 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0373f9a1-1537-4f29-905a-b0fb2affc113-etcd-client\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.112484 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.115552 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xxrkx"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.133383 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.137420 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.138659 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.141321 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.143131 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l6p59"] Jan 30 08:11:51 crc 
kubenswrapper[4870]: I0130 08:11:51.143254 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.143744 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2l7mq"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.145718 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.147060 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.148328 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.149148 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sfs65"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.150814 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.153153 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.153311 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.163105 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k7wnt"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.166190 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.167757 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.170925 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.172669 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.175266 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.179934 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.181734 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.182978 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.183976 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k7wnt"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.185638 4870 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.186285 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jh9j6"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.187169 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g6x2r"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.189477 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.202217 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mnsdp"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.203421 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.204827 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-szhwx"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.206759 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pm4xm"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.206982 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-szhwx" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.208424 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.209221 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-szhwx"] Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.209290 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.216729 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.231424 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.250324 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.270262 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.310370 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.330240 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.349562 4870 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.370313 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.410249 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.429474 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.450344 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.469739 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.490492 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.511291 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.530249 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.550413 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.570162 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.590785 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.611551 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.632006 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.651066 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.671024 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.691287 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.716749 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.731762 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.752203 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.770329 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.790952 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.811324 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.831054 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.850214 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.870843 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.890657 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.910977 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.929421 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.951195 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.971602 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 30 08:11:51 crc kubenswrapper[4870]: I0130 08:11:51.991362 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 08:11:52 
crc kubenswrapper[4870]: I0130 08:11:52.011940 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.032142 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.049061 4870 request.go:700] Waited for 1.018763456s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serviceaccount-dockercfg-rq7zk&limit=500&resourceVersion=0 Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.051806 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.071412 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.091108 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.109454 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.139943 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.150747 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: 
I0130 08:11:52.172033 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.190638 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.211149 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.230963 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.251410 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.271185 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.290057 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.310516 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.331344 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.350558 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.371152 
4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.390519 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.410044 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.431692 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.451046 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.470660 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.491357 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.519670 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.530824 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.551116 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.570701 4870 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.591262 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.610905 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.656353 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8bd\" (UniqueName: \"kubernetes.io/projected/b2cd7eb7-87cb-44dc-a01f-17985460c12c-kube-api-access-8x8bd\") pod \"openshift-controller-manager-operator-756b6f6bc6-6lcwc\" (UID: \"b2cd7eb7-87cb-44dc-a01f-17985460c12c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.672128 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnb8\" (UniqueName: \"kubernetes.io/projected/b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa-kube-api-access-rgnb8\") pod \"console-operator-58897d9998-cdzxd\" (UID: \"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa\") " pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.690455 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/739bcba5-d8ef-45fe-abf9-02d74d0d093c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7t7rr\" (UID: \"739bcba5-d8ef-45fe-abf9-02d74d0d093c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.699772 4870 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.713492 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62vw\" (UniqueName: \"kubernetes.io/projected/b41bf206-4b95-49db-85b6-2e5fe6dcc5ef-kube-api-access-l62vw\") pod \"kube-storage-version-migrator-operator-b67b599dd-mtt79\" (UID: \"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.735426 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpw6w\" (UniqueName: \"kubernetes.io/projected/042ed63b-a1a9-4072-ae87-71b9fb98280c-kube-api-access-lpw6w\") pod \"machine-api-operator-5694c8668f-jr94b\" (UID: \"042ed63b-a1a9-4072-ae87-71b9fb98280c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.761009 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66fq7\" (UniqueName: \"kubernetes.io/projected/0373f9a1-1537-4f29-905a-b0fb2affc113-kube-api-access-66fq7\") pod \"apiserver-76f77b778f-2g2tj\" (UID: \"0373f9a1-1537-4f29-905a-b0fb2affc113\") " pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.762517 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.781120 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtnn7\" (UniqueName: \"kubernetes.io/projected/41ae1460-1e39-4d11-9357-3e0111521a8e-kube-api-access-xtnn7\") pod \"dns-operator-744455d44c-s6768\" (UID: \"41ae1460-1e39-4d11-9357-3e0111521a8e\") " pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.791920 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.802022 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pjr\" (UniqueName: \"kubernetes.io/projected/15eddd48-9a41-41cb-a284-80d01c7f8aad-kube-api-access-25pjr\") pod \"openshift-config-operator-7777fb866f-8nfd4\" (UID: \"15eddd48-9a41-41cb-a284-80d01c7f8aad\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.812601 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.851738 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.855253 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.872186 4870 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.891717 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.912451 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.933702 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.936820 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.945845 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.951479 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.964219 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.964535 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.972316 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.987464 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" Jan 30 08:11:52 crc kubenswrapper[4870]: I0130 08:11:52.990785 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.007763 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.010343 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.029125 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod739bcba5_d8ef_45fe_abf9_02d74d0d093c.slice/crio-d78272f33e8d184722965f0ee1a97e79bfa07cf5540dee502f1ae7804eb4a5df WatchSource:0}: Error finding container d78272f33e8d184722965f0ee1a97e79bfa07cf5540dee502f1ae7804eb4a5df: Status 404 returned error can't find the container with id d78272f33e8d184722965f0ee1a97e79bfa07cf5540dee502f1ae7804eb4a5df Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.032664 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.033889 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.049103 4870 request.go:700] Waited for 1.756211651s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/persistentvolumes/pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 
08:11:53.054653 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-s6768" Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.068036 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41bf206_4b95_49db_85b6_2e5fe6dcc5ef.slice/crio-c81f54e94b920e6c7cfd46f69dde3b3574a636bf82ebf62b99dce8708d66585b WatchSource:0}: Error finding container c81f54e94b920e6c7cfd46f69dde3b3574a636bf82ebf62b99dce8708d66585b: Status 404 returned error can't find the container with id c81f54e94b920e6c7cfd46f69dde3b3574a636bf82ebf62b99dce8708d66585b Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.076403 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" event={"ID":"739bcba5-d8ef-45fe-abf9-02d74d0d093c","Type":"ContainerStarted","Data":"d78272f33e8d184722965f0ee1a97e79bfa07cf5540dee502f1ae7804eb4a5df"} Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095024 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-oauth-serving-cert\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095110 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-config\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095130 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-tls\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095153 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02c6a6cf-5413-4524-a86c-11fa4a19821f-serving-cert\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095314 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-service-ca\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095381 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-dir\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095419 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: 
\"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095504 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095549 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dp8v\" (UniqueName: \"kubernetes.io/projected/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-kube-api-access-4dp8v\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095586 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-encryption-config\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095622 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-client-ca\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095647 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095677 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-policies\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095798 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c488e93c-573d-4d04-a272-699af1059a0e-serving-cert\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095825 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5fkn\" (UniqueName: \"kubernetes.io/projected/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-kube-api-access-c5fkn\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.095866 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frbv4\" (UniqueName: \"kubernetes.io/projected/c488e93c-573d-4d04-a272-699af1059a0e-kube-api-access-frbv4\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096028 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096069 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6hw\" (UniqueName: \"kubernetes.io/projected/2aa49ce7-f902-408a-94f1-da14a661e813-kube-api-access-ct6hw\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096114 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-config\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096145 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096178 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-certificates\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096236 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-trusted-ca-bundle\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096267 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096292 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096323 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-trusted-ca\") pod \"image-registry-697d97f7c8-sfs65\" (UID: 
\"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096349 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-serving-cert\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096377 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a361e11a-9e2f-4abf-a8c1-783f328f13a9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096412 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmfbv\" (UniqueName: \"kubernetes.io/projected/78280554-7b5b-4ccf-a674-2664144e4f5a-kube-api-access-dmfbv\") pod \"downloads-7954f5f757-v7bvt\" (UID: \"78280554-7b5b-4ccf-a674-2664144e4f5a\") " pod="openshift-console/downloads-7954f5f757-v7bvt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096441 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/406fb8be-c783-4ef8-8aae-5430b0226d17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096472 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096544 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096568 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096726 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/faa3ca31-2951-4f0d-84f0-0b19a32c9927-audit-dir\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096782 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbwqk\" (UniqueName: \"kubernetes.io/projected/02c6a6cf-5413-4524-a86c-11fa4a19821f-kube-api-access-hbwqk\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") 
" pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096842 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzzz\" (UniqueName: \"kubernetes.io/projected/faa3ca31-2951-4f0d-84f0-0b19a32c9927-kube-api-access-nzzzz\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096868 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-oauth-config\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.096918 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-ca\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097015 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-serving-cert\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097047 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-serving-cert\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097070 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097113 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097134 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-bound-sa-token\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097150 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097171 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097210 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a361e11a-9e2f-4abf-a8c1-783f328f13a9-config\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097234 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57tbk\" (UniqueName: \"kubernetes.io/projected/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-kube-api-access-57tbk\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097256 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097278 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a361e11a-9e2f-4abf-a8c1-783f328f13a9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097295 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-client\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097324 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097349 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097372 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-audit-policies\") pod 
\"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097391 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-client-ca\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097410 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-etcd-client\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097432 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-config\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097453 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097473 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097504 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46h2g\" (UniqueName: \"kubernetes.io/projected/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-kube-api-access-46h2g\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097525 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldpq\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-kube-api-access-jldpq\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097564 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.097582 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 08:11:53.597566135 +0000 UTC m=+152.293113244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097627 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-console-config\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097916 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097945 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.097977 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/406fb8be-c783-4ef8-8aae-5430b0226d17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.098001 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx49z\" (UniqueName: \"kubernetes.io/projected/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-kube-api-access-nx49z\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.099239 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.099272 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-service-ca\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200111 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200348 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9wg\" (UniqueName: \"kubernetes.io/projected/a8a1f91a-b48e-442f-9ab6-d704b3927315-kube-api-access-nm9wg\") pod \"ingress-canary-szhwx\" (UID: \"a8a1f91a-b48e-442f-9ab6-d704b3927315\") " pod="openshift-ingress-canary/ingress-canary-szhwx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200390 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-ca\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200410 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-mountpoint-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200427 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzzzz\" (UniqueName: \"kubernetes.io/projected/faa3ca31-2951-4f0d-84f0-0b19a32c9927-kube-api-access-nzzzz\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200446 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-oauth-config\") pod \"console-f9d7485db-2mj87\" (UID: 
\"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200463 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6dp\" (UniqueName: \"kubernetes.io/projected/a58a222f-98a0-46b4-9ea8-36a922f6a349-kube-api-access-kk6dp\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200479 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e64f35db-e72b-4d73-b501-7c2aff5cc609-images\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200496 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/539f32b7-3075-49f4-b9f6-e63ac1d76d61-proxy-tls\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200531 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200550 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200577 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a27619-258e-4bed-afb0-1706904c6f9d-config\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200597 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a361e11a-9e2f-4abf-a8c1-783f328f13a9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200613 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200632 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j4f7\" (UniqueName: \"kubernetes.io/projected/38a2c6cb-fd9d-42f6-8774-647c544bd0f9-kube-api-access-6j4f7\") pod \"multus-admission-controller-857f4d67dd-g6x2r\" (UID: \"38a2c6cb-fd9d-42f6-8774-647c544bd0f9\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200654 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/836ae3f6-06f5-4996-9f9c-cacfb63fe855-profile-collector-cert\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200673 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-audit-policies\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200689 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-client-ca\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200717 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-config\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200734 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-etcd-client\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200756 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200779 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200795 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46h2g\" (UniqueName: \"kubernetes.io/projected/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-kube-api-access-46h2g\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200814 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a27619-258e-4bed-afb0-1706904c6f9d-serving-cert\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200829 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldpq\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-kube-api-access-jldpq\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200848 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200866 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200945 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a58a222f-98a0-46b4-9ea8-36a922f6a349-config-volume\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.200998 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-service-ca\") pod 
\"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201013 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8a1f91a-b48e-442f-9ab6-d704b3927315-cert\") pod \"ingress-canary-szhwx\" (UID: \"a8a1f91a-b48e-442f-9ab6-d704b3927315\") " pod="openshift-ingress-canary/ingress-canary-szhwx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201040 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2xk7\" (UniqueName: \"kubernetes.io/projected/539f32b7-3075-49f4-b9f6-e63ac1d76d61-kube-api-access-p2xk7\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201060 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53a09b74-1b42-4535-a853-0752b6d1f90a-srv-cert\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201101 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-default-certificate\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201117 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e64f35db-e72b-4d73-b501-7c2aff5cc609-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201133 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7053ea40-6d30-41d8-bcb1-8f55e95feb22-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-26vrg\" (UID: \"7053ea40-6d30-41d8-bcb1-8f55e95feb22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201150 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crsg9\" (UniqueName: \"kubernetes.io/projected/836ae3f6-06f5-4996-9f9c-cacfb63fe855-kube-api-access-crsg9\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201165 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-socket-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201192 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02c6a6cf-5413-4524-a86c-11fa4a19821f-serving-cert\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201208 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201225 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58a222f-98a0-46b4-9ea8-36a922f6a349-metrics-tls\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201244 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgt5z\" (UniqueName: \"kubernetes.io/projected/9624fd43-bfa5-42c8-bebd-95a89988847d-kube-api-access-wgt5z\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201260 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bznlq\" (UniqueName: \"kubernetes.io/projected/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-kube-api-access-bznlq\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201286 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201304 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmwtm\" (UniqueName: \"kubernetes.io/projected/8ede517d-773d-4f0b-8c0a-42ae13359f95-kube-api-access-qmwtm\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201319 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88dw5\" (UniqueName: \"kubernetes.io/projected/53a09b74-1b42-4535-a853-0752b6d1f90a-kube-api-access-88dw5\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201348 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38a2c6cb-fd9d-42f6-8774-647c544bd0f9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g6x2r\" (UID: \"38a2c6cb-fd9d-42f6-8774-647c544bd0f9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201374 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-service-ca-bundle\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " 
pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201391 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdz27\" (UniqueName: \"kubernetes.io/projected/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-kube-api-access-wdz27\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201406 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/46d623aa-7e54-4c20-aed3-3f125395a073-apiservice-cert\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201420 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e64f35db-e72b-4d73-b501-7c2aff5cc609-proxy-tls\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201438 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201453 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lss9f\" (UniqueName: \"kubernetes.io/projected/78a27619-258e-4bed-afb0-1706904c6f9d-kube-api-access-lss9f\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201480 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-config\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201498 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9624fd43-bfa5-42c8-bebd-95a89988847d-auth-proxy-config\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201515 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-plugins-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201532 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-trusted-ca-bundle\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc 
kubenswrapper[4870]: I0130 08:11:53.201549 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201565 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-serving-cert\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201586 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a361e11a-9e2f-4abf-a8c1-783f328f13a9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201604 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmfbv\" (UniqueName: \"kubernetes.io/projected/78280554-7b5b-4ccf-a674-2664144e4f5a-kube-api-access-dmfbv\") pod \"downloads-7954f5f757-v7bvt\" (UID: \"78280554-7b5b-4ccf-a674-2664144e4f5a\") " pod="openshift-console/downloads-7954f5f757-v7bvt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201621 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-registration-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: 
\"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201637 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a6267b2-1222-4c0b-a890-c146d83b583d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tsghw\" (UID: \"1a6267b2-1222-4c0b-a890-c146d83b583d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201662 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201679 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9624fd43-bfa5-42c8-bebd-95a89988847d-config\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201697 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53a09b74-1b42-4535-a853-0752b6d1f90a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201715 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hbwqk\" (UniqueName: \"kubernetes.io/projected/02c6a6cf-5413-4524-a86c-11fa4a19821f-kube-api-access-hbwqk\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201743 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wsh5\" (UniqueName: \"kubernetes.io/projected/6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72-kube-api-access-6wsh5\") pod \"control-plane-machine-set-operator-78cbb6b69f-vzzk7\" (UID: \"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201766 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chvf4\" (UniqueName: \"kubernetes.io/projected/e64f35db-e72b-4d73-b501-7c2aff5cc609-kube-api-access-chvf4\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201784 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-secret-volume\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201799 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77lc5\" (UniqueName: 
\"kubernetes.io/projected/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-kube-api-access-77lc5\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201828 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-serving-cert\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201845 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201862 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8x7q\" (UniqueName: \"kubernetes.io/projected/f6d9ba19-88ea-489c-9f03-918e8b225e3b-kube-api-access-n8x7q\") pod \"migrator-59844c95c7-ps5nw\" (UID: \"f6d9ba19-88ea-489c-9f03-918e8b225e3b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201894 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-serving-cert\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201911 
4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201927 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-bound-sa-token\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201943 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/836ae3f6-06f5-4996-9f9c-cacfb63fe855-srv-cert\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201960 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a361e11a-9e2f-4abf-a8c1-783f328f13a9-config\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201976 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57tbk\" (UniqueName: \"kubernetes.io/projected/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-kube-api-access-57tbk\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: 
\"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.201995 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e740ffac-368d-45d5-89a8-25d370581945-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202023 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202344 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202421 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-client\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202441 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vzzk7\" (UID: \"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202789 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-csi-data-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202896 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/853280ad-9d5a-4fe9-852f-c0596e70dc49-node-bootstrap-token\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202941 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-signing-key\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.202988 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-service-ca-bundle\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.203014 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e740ffac-368d-45d5-89a8-25d370581945-config\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.203780 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-serving-cert\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.203826 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-config\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.204000 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-console-config\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.204486 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.204521 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/406fb8be-c783-4ef8-8aae-5430b0226d17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.204558 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx49z\" (UniqueName: \"kubernetes.io/projected/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-kube-api-access-nx49z\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.204583 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205250 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e740ffac-368d-45d5-89a8-25d370581945-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205296 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205317 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kll8p\" (UniqueName: \"kubernetes.io/projected/7053ea40-6d30-41d8-bcb1-8f55e95feb22-kube-api-access-kll8p\") pod \"package-server-manager-789f6589d5-26vrg\" (UID: \"7053ea40-6d30-41d8-bcb1-8f55e95feb22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205340 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22qbp\" (UniqueName: \"kubernetes.io/projected/1a6267b2-1222-4c0b-a890-c146d83b583d-kube-api-access-22qbp\") pod \"cluster-samples-operator-665b6dd947-tsghw\" (UID: \"1a6267b2-1222-4c0b-a890-c146d83b583d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205526 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5rm\" (UniqueName: \"kubernetes.io/projected/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-kube-api-access-qv5rm\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205566 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-oauth-serving-cert\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205734 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-config\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205952 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-tls\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.205981 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-service-ca\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206005 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-dir\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206025 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/853280ad-9d5a-4fe9-852f-c0596e70dc49-certs\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206258 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dp8v\" (UniqueName: \"kubernetes.io/projected/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-kube-api-access-4dp8v\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206287 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206307 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/46d623aa-7e54-4c20-aed3-3f125395a073-webhook-cert\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206491 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-encryption-config\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206520 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-client-ca\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206540 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l822t\" (UniqueName: \"kubernetes.io/projected/46d623aa-7e54-4c20-aed3-3f125395a073-kube-api-access-l822t\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206706 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-policies\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206735 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c488e93c-573d-4d04-a272-699af1059a0e-serving-cert\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206759 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5fkn\" (UniqueName: 
\"kubernetes.io/projected/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-kube-api-access-c5fkn\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.206779 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frbv4\" (UniqueName: \"kubernetes.io/projected/c488e93c-573d-4d04-a272-699af1059a0e-kube-api-access-frbv4\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.207027 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-signing-cabundle\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.207051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6hw\" (UniqueName: \"kubernetes.io/projected/2aa49ce7-f902-408a-94f1-da14a661e813-kube-api-access-ct6hw\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.207074 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-metrics-certs\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 
08:11:53.207097 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-certificates\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.207240 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.207286 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.207316 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/46d623aa-7e54-4c20-aed3-3f125395a073-tmpfs\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.207343 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.221094 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-ca\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.221817 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/406fb8be-c783-4ef8-8aae-5430b0226d17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.221922 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-trusted-ca\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.221994 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5dzd\" (UniqueName: \"kubernetes.io/projected/853280ad-9d5a-4fe9-852f-c0596e70dc49-kube-api-access-n5dzd\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222061 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9qp2d\" (UniqueName: \"kubernetes.io/projected/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-kube-api-access-9qp2d\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222134 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222187 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222242 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-stats-auth\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222356 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/faa3ca31-2951-4f0d-84f0-0b19a32c9927-audit-dir\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc 
kubenswrapper[4870]: I0130 08:11:53.222436 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-config-volume\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222508 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9624fd43-bfa5-42c8-bebd-95a89988847d-machine-approver-tls\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222551 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/539f32b7-3075-49f4-b9f6-e63ac1d76d61-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.222481 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-dir\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.223436 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a361e11a-9e2f-4abf-a8c1-783f328f13a9-config\") pod 
\"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.224987 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.225486 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.226245 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.227005 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.227633 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.227957 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.228242 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.231077 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.231254 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.231302 
4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.231767 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-serving-cert\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.231827 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.232396 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/faa3ca31-2951-4f0d-84f0-0b19a32c9927-audit-policies\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.236092 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.236484 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-config\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.236686 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-config\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.237597 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-client-ca\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.237757 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.237754 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-console-config\") pod 
\"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.238583 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-policies\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.239287 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.239469 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-service-ca\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.240019 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.240374 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-client\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.240370 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-trusted-ca\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.240675 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.241316 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-config\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.241340 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-trusted-ca-bundle\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.241583 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/406fb8be-c783-4ef8-8aae-5430b0226d17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.242935 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-certificates\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.243531 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-oauth-config\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.243962 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-client-ca\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.245532 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-encryption-config\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.246582 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/02c6a6cf-5413-4524-a86c-11fa4a19821f-etcd-service-ca\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.248128 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.248218 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/faa3ca31-2951-4f0d-84f0-0b19a32c9927-audit-dir\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.248318 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:53.748289847 +0000 UTC m=+152.443836956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.249163 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.249333 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-oauth-serving-cert\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.250474 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-tls\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.251047 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.252652 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.253322 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c488e93c-573d-4d04-a272-699af1059a0e-serving-cert\") pod \"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.253602 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-serving-cert\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.259465 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02c6a6cf-5413-4524-a86c-11fa4a19821f-serving-cert\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.259646 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/406fb8be-c783-4ef8-8aae-5430b0226d17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.268555 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-etcd-client\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.268963 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faa3ca31-2951-4f0d-84f0-0b19a32c9927-serving-cert\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.270727 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzzzz\" (UniqueName: \"kubernetes.io/projected/faa3ca31-2951-4f0d-84f0-0b19a32c9927-kube-api-access-nzzzz\") pod \"apiserver-7bbb656c7d-brrxb\" (UID: \"faa3ca31-2951-4f0d-84f0-0b19a32c9927\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.272619 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a361e11a-9e2f-4abf-a8c1-783f328f13a9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.294511 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldpq\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-kube-api-access-jldpq\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.307376 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5fkn\" (UniqueName: \"kubernetes.io/projected/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-kube-api-access-c5fkn\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.312981 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325074 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325448 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a27619-258e-4bed-afb0-1706904c6f9d-config\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325485 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j4f7\" (UniqueName: \"kubernetes.io/projected/38a2c6cb-fd9d-42f6-8774-647c544bd0f9-kube-api-access-6j4f7\") pod \"multus-admission-controller-857f4d67dd-g6x2r\" (UID: \"38a2c6cb-fd9d-42f6-8774-647c544bd0f9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325507 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/836ae3f6-06f5-4996-9f9c-cacfb63fe855-profile-collector-cert\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325525 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-config\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325551 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a27619-258e-4bed-afb0-1706904c6f9d-serving-cert\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325578 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a58a222f-98a0-46b4-9ea8-36a922f6a349-config-volume\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325599 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8a1f91a-b48e-442f-9ab6-d704b3927315-cert\") pod \"ingress-canary-szhwx\" (UID: \"a8a1f91a-b48e-442f-9ab6-d704b3927315\") " pod="openshift-ingress-canary/ingress-canary-szhwx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325619 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2xk7\" (UniqueName: \"kubernetes.io/projected/539f32b7-3075-49f4-b9f6-e63ac1d76d61-kube-api-access-p2xk7\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325642 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7053ea40-6d30-41d8-bcb1-8f55e95feb22-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-26vrg\" (UID: \"7053ea40-6d30-41d8-bcb1-8f55e95feb22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325801 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53a09b74-1b42-4535-a853-0752b6d1f90a-srv-cert\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.325832 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-default-certificate\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.326418 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a27619-258e-4bed-afb0-1706904c6f9d-config\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n"
Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.326560 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:53.826535303 +0000 UTC m=+152.522082412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.326812 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a58a222f-98a0-46b4-9ea8-36a922f6a349-config-volume\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.326904 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e64f35db-e72b-4d73-b501-7c2aff5cc609-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.326927 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgt5z\" (UniqueName: \"kubernetes.io/projected/9624fd43-bfa5-42c8-bebd-95a89988847d-kube-api-access-wgt5z\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.327071 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-config\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.327441 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e64f35db-e72b-4d73-b501-7c2aff5cc609-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.327504 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crsg9\" (UniqueName: \"kubernetes.io/projected/836ae3f6-06f5-4996-9f9c-cacfb63fe855-kube-api-access-crsg9\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.327530 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-socket-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.327845 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58a222f-98a0-46b4-9ea8-36a922f6a349-metrics-tls\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.328542 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-socket-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.327868 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bznlq\" (UniqueName: \"kubernetes.io/projected/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-kube-api-access-bznlq\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.328981 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmwtm\" (UniqueName: \"kubernetes.io/projected/8ede517d-773d-4f0b-8c0a-42ae13359f95-kube-api-access-qmwtm\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329006 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88dw5\" (UniqueName: \"kubernetes.io/projected/53a09b74-1b42-4535-a853-0752b6d1f90a-kube-api-access-88dw5\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329032 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38a2c6cb-fd9d-42f6-8774-647c544bd0f9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g6x2r\" (UID: \"38a2c6cb-fd9d-42f6-8774-647c544bd0f9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329053 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-service-ca-bundle\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329072 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdz27\" (UniqueName: \"kubernetes.io/projected/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-kube-api-access-wdz27\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329094 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/46d623aa-7e54-4c20-aed3-3f125395a073-apiservice-cert\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329735 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/53a09b74-1b42-4535-a853-0752b6d1f90a-srv-cert\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329802 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e64f35db-e72b-4d73-b501-7c2aff5cc609-proxy-tls\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329833 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lss9f\" (UniqueName: \"kubernetes.io/projected/78a27619-258e-4bed-afb0-1706904c6f9d-kube-api-access-lss9f\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329859 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9624fd43-bfa5-42c8-bebd-95a89988847d-auth-proxy-config\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329904 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-plugins-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.329926 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a6267b2-1222-4c0b-a890-c146d83b583d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tsghw\" (UID: \"1a6267b2-1222-4c0b-a890-c146d83b583d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330005 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-registration-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330034 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9624fd43-bfa5-42c8-bebd-95a89988847d-config\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53a09b74-1b42-4535-a853-0752b6d1f90a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330088 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wsh5\" (UniqueName: \"kubernetes.io/projected/6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72-kube-api-access-6wsh5\") pod \"control-plane-machine-set-operator-78cbb6b69f-vzzk7\" (UID: \"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330107 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chvf4\" (UniqueName: \"kubernetes.io/projected/e64f35db-e72b-4d73-b501-7c2aff5cc609-kube-api-access-chvf4\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330126 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-secret-volume\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330145 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77lc5\" (UniqueName: \"kubernetes.io/projected/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-kube-api-access-77lc5\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330183 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330202 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8x7q\" (UniqueName: \"kubernetes.io/projected/f6d9ba19-88ea-489c-9f03-918e8b225e3b-kube-api-access-n8x7q\") pod \"migrator-59844c95c7-ps5nw\" (UID: \"f6d9ba19-88ea-489c-9f03-918e8b225e3b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330228 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/836ae3f6-06f5-4996-9f9c-cacfb63fe855-srv-cert\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330246 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e740ffac-368d-45d5-89a8-25d370581945-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330276 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vzzk7\" (UID: \"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330296 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-csi-data-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330315 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/853280ad-9d5a-4fe9-852f-c0596e70dc49-node-bootstrap-token\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330332 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-signing-key\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330363 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-service-ca-bundle\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330380 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e740ffac-368d-45d5-89a8-25d370581945-config\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330397 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-serving-cert\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330429 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e740ffac-368d-45d5-89a8-25d370581945-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330449 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330472 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kll8p\" (UniqueName: \"kubernetes.io/projected/7053ea40-6d30-41d8-bcb1-8f55e95feb22-kube-api-access-kll8p\") pod \"package-server-manager-789f6589d5-26vrg\" (UID: \"7053ea40-6d30-41d8-bcb1-8f55e95feb22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330488 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22qbp\" (UniqueName: \"kubernetes.io/projected/1a6267b2-1222-4c0b-a890-c146d83b583d-kube-api-access-22qbp\") pod \"cluster-samples-operator-665b6dd947-tsghw\" (UID: \"1a6267b2-1222-4c0b-a890-c146d83b583d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330505 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5rm\" (UniqueName: \"kubernetes.io/projected/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-kube-api-access-qv5rm\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330527 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/853280ad-9d5a-4fe9-852f-c0596e70dc49-certs\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330551 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/46d623aa-7e54-4c20-aed3-3f125395a073-webhook-cert\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330569 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l822t\" (UniqueName: \"kubernetes.io/projected/46d623aa-7e54-4c20-aed3-3f125395a073-kube-api-access-l822t\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330600 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-signing-cabundle\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330644 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-metrics-certs\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330668 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/46d623aa-7e54-4c20-aed3-3f125395a073-tmpfs\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330685 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330701 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5dzd\" (UniqueName: \"kubernetes.io/projected/853280ad-9d5a-4fe9-852f-c0596e70dc49-kube-api-access-n5dzd\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330700 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-service-ca-bundle\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330718 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qp2d\" (UniqueName: \"kubernetes.io/projected/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-kube-api-access-9qp2d\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330739 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-stats-auth\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330759 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-config-volume\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330776 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9624fd43-bfa5-42c8-bebd-95a89988847d-machine-approver-tls\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330797 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/539f32b7-3075-49f4-b9f6-e63ac1d76d61-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330817 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-csi-data-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330835 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9wg\" (UniqueName: \"kubernetes.io/projected/a8a1f91a-b48e-442f-9ab6-d704b3927315-kube-api-access-nm9wg\") pod \"ingress-canary-szhwx\" (UID: \"a8a1f91a-b48e-442f-9ab6-d704b3927315\") " pod="openshift-ingress-canary/ingress-canary-szhwx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330865 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-mountpoint-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330897 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6dp\" (UniqueName: \"kubernetes.io/projected/a58a222f-98a0-46b4-9ea8-36a922f6a349-kube-api-access-kk6dp\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330917 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e64f35db-e72b-4d73-b501-7c2aff5cc609-images\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.330946 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/539f32b7-3075-49f4-b9f6-e63ac1d76d61-proxy-tls\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.332715 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8a1f91a-b48e-442f-9ab6-d704b3927315-cert\") pod \"ingress-canary-szhwx\" (UID: \"a8a1f91a-b48e-442f-9ab6-d704b3927315\") " pod="openshift-ingress-canary/ingress-canary-szhwx"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.336014 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/836ae3f6-06f5-4996-9f9c-cacfb63fe855-profile-collector-cert\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9"
Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.336400 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38a2c6cb-fd9d-42f6-8774-647c544bd0f9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g6x2r\" (UID:
\"38a2c6cb-fd9d-42f6-8774-647c544bd0f9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.336822 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58a222f-98a0-46b4-9ea8-36a922f6a349-metrics-tls\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.337158 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/539f32b7-3075-49f4-b9f6-e63ac1d76d61-proxy-tls\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.337411 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a27619-258e-4bed-afb0-1706904c6f9d-serving-cert\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.337523 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-registration-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.338578 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: 
\"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.338734 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9624fd43-bfa5-42c8-bebd-95a89988847d-config\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.338759 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e64f35db-e72b-4d73-b501-7c2aff5cc609-proxy-tls\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.338979 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dp8v\" (UniqueName: \"kubernetes.io/projected/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-kube-api-access-4dp8v\") pod \"oauth-openshift-558db77b4-xxrkx\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") " pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.339574 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/46d623aa-7e54-4c20-aed3-3f125395a073-apiservice-cert\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.339589 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-signing-cabundle\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.340353 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/853280ad-9d5a-4fe9-852f-c0596e70dc49-node-bootstrap-token\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.340606 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e740ffac-368d-45d5-89a8-25d370581945-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.340976 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a6267b2-1222-4c0b-a890-c146d83b583d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tsghw\" (UID: \"1a6267b2-1222-4c0b-a890-c146d83b583d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.340992 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-config-volume\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.341463 
4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-service-ca-bundle\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.341566 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9624fd43-bfa5-42c8-bebd-95a89988847d-auth-proxy-config\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.341627 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-plugins-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.341868 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vzzk7\" (UID: \"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.342155 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-default-certificate\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") 
" pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.343014 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/46d623aa-7e54-4c20-aed3-3f125395a073-tmpfs\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.344025 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/853280ad-9d5a-4fe9-852f-c0596e70dc49-certs\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.344077 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e740ffac-368d-45d5-89a8-25d370581945-config\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.344583 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-mountpoint-dir\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.345397 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l6p59\" (UID: 
\"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.346213 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-signing-key\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.346803 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7053ea40-6d30-41d8-bcb1-8f55e95feb22-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-26vrg\" (UID: \"7053ea40-6d30-41d8-bcb1-8f55e95feb22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.346980 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/46d623aa-7e54-4c20-aed3-3f125395a073-webhook-cert\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.347117 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-metrics-certs\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.350053 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e64f35db-e72b-4d73-b501-7c2aff5cc609-images\") 
pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.350411 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/836ae3f6-06f5-4996-9f9c-cacfb63fe855-srv-cert\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.350600 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9624fd43-bfa5-42c8-bebd-95a89988847d-machine-approver-tls\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.351679 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-serving-cert\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.351849 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-secret-volume\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.351974 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-stats-auth\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.352846 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/539f32b7-3075-49f4-b9f6-e63ac1d76d61-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.353946 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/53a09b74-1b42-4535-a853-0752b6d1f90a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.356239 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.362155 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jr94b"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.364306 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frbv4\" (UniqueName: \"kubernetes.io/projected/c488e93c-573d-4d04-a272-699af1059a0e-kube-api-access-frbv4\") pod 
\"route-controller-manager-6576b87f9c-8p957\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.368407 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57tbk\" (UniqueName: \"kubernetes.io/projected/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-kube-api-access-57tbk\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.372090 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod042ed63b_a1a9_4072_ae87_71b9fb98280c.slice/crio-32bfae6fb5cd0844ffae17d236efc6733c35b12c04679da68fc6b71a72a5cb35 WatchSource:0}: Error finding container 32bfae6fb5cd0844ffae17d236efc6733c35b12c04679da68fc6b71a72a5cb35: Status 404 returned error can't find the container with id 32bfae6fb5cd0844ffae17d236efc6733c35b12c04679da68fc6b71a72a5cb35 Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.382760 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.389910 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-bound-sa-token\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.394655 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s6768"] Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.404444 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41ae1460_1e39_4d11_9357_3e0111521a8e.slice/crio-b2e7a12e8e0a754fdea117b1a6d70f90fbef4cf713402e604223c579e30d53dd WatchSource:0}: Error finding container b2e7a12e8e0a754fdea117b1a6d70f90fbef4cf713402e604223c579e30d53dd: Status 404 returned error can't find the container with id b2e7a12e8e0a754fdea117b1a6d70f90fbef4cf713402e604223c579e30d53dd Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.417190 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2bdbb40-bece-420f-9ff7-bdeff90c8bd2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvwdd\" (UID: \"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.431504 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.432475 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:53.932411084 +0000 UTC m=+152.627958193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.434333 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46h2g\" (UniqueName: \"kubernetes.io/projected/2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715-kube-api-access-46h2g\") pod \"openshift-apiserver-operator-796bbdcf4f-vjvf5\" (UID: \"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.455997 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.467350 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a361e11a-9e2f-4abf-a8c1-783f328f13a9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nw244\" (UID: \"a361e11a-9e2f-4abf-a8c1-783f328f13a9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.468042 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx49z\" (UniqueName: \"kubernetes.io/projected/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-kube-api-access-nx49z\") pod \"controller-manager-879f6c89f-kkf4z\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.477309 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.488072 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbwqk\" (UniqueName: \"kubernetes.io/projected/02c6a6cf-5413-4524-a86c-11fa4a19821f-kube-api-access-hbwqk\") pod \"etcd-operator-b45778765-ssxgx\" (UID: \"02c6a6cf-5413-4524-a86c-11fa4a19821f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.492952 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.500313 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2g2tj"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 
08:11:53.507823 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6hw\" (UniqueName: \"kubernetes.io/projected/2aa49ce7-f902-408a-94f1-da14a661e813-kube-api-access-ct6hw\") pod \"console-f9d7485db-2mj87\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.533542 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.534113 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.034097591 +0000 UTC m=+152.729644700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.534322 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0373f9a1_1537_4f29_905a_b0fb2affc113.slice/crio-c692b0d8c3c6b489e0a3be41a0f61c863ee7572f9ff504585ba318691ad55a3a WatchSource:0}: Error finding container c692b0d8c3c6b489e0a3be41a0f61c863ee7572f9ff504585ba318691ad55a3a: Status 404 returned error can't find the container with id c692b0d8c3c6b489e0a3be41a0f61c863ee7572f9ff504585ba318691ad55a3a Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.546015 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.547024 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a55dd8c8-fb6d-450b-a80d-35e7223d2cff-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-67xxt\" (UID: \"a55dd8c8-fb6d-450b-a80d-35e7223d2cff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.549374 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.549811 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmfbv\" (UniqueName: \"kubernetes.io/projected/78280554-7b5b-4ccf-a674-2664144e4f5a-kube-api-access-dmfbv\") pod \"downloads-7954f5f757-v7bvt\" (UID: \"78280554-7b5b-4ccf-a674-2664144e4f5a\") " pod="openshift-console/downloads-7954f5f757-v7bvt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.554677 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cdzxd"] Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.558031 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15eddd48_9a41_41cb_a284_80d01c7f8aad.slice/crio-5f268d6983af1b7b879bee78cbfed53fb30971cb70a4d5ffa4f6c3907b23bc91 WatchSource:0}: Error finding container 5f268d6983af1b7b879bee78cbfed53fb30971cb70a4d5ffa4f6c3907b23bc91: Status 404 returned error can't find the container with id 5f268d6983af1b7b879bee78cbfed53fb30971cb70a4d5ffa4f6c3907b23bc91 Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.569737 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa3ca31_2951_4f0d_84f0_0b19a32c9927.slice/crio-914a412c6899858cfe6663eb8b2abe03171d9143153a4ffb21f681ec056f1903 WatchSource:0}: Error finding container 914a412c6899858cfe6663eb8b2abe03171d9143153a4ffb21f681ec056f1903: Status 404 returned error can't find the container with id 914a412c6899858cfe6663eb8b2abe03171d9143153a4ffb21f681ec056f1903 Jan 30 08:11:53 crc kubenswrapper[4870]: W0130 08:11:53.586934 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9be3c7b_3bd5_48ba_bd5e_affe9a29d8aa.slice/crio-d1e9dd4ef3e489bf0e9ad4838e5425badd078d9cbf455d47b5eb1daeb406663f WatchSource:0}: Error finding container d1e9dd4ef3e489bf0e9ad4838e5425badd078d9cbf455d47b5eb1daeb406663f: Status 404 returned error can't find the container with id d1e9dd4ef3e489bf0e9ad4838e5425badd078d9cbf455d47b5eb1daeb406663f Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.590201 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2xk7\" (UniqueName: \"kubernetes.io/projected/539f32b7-3075-49f4-b9f6-e63ac1d76d61-kube-api-access-p2xk7\") pod \"machine-config-controller-84d6567774-zghdb\" (UID: \"539f32b7-3075-49f4-b9f6-e63ac1d76d61\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.609550 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.615089 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j4f7\" (UniqueName: \"kubernetes.io/projected/38a2c6cb-fd9d-42f6-8774-647c544bd0f9-kube-api-access-6j4f7\") pod \"multus-admission-controller-857f4d67dd-g6x2r\" (UID: \"38a2c6cb-fd9d-42f6-8774-647c544bd0f9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.621173 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.626312 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.628092 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgt5z\" (UniqueName: \"kubernetes.io/projected/9624fd43-bfa5-42c8-bebd-95a89988847d-kube-api-access-wgt5z\") pod \"machine-approver-56656f9798-vn7wx\" (UID: \"9624fd43-bfa5-42c8-bebd-95a89988847d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.634969 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.635413 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.135109238 +0000 UTC m=+152.830656347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.635981 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.637004 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.136968263 +0000 UTC m=+152.832515372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.644462 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.646731 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crsg9\" (UniqueName: \"kubernetes.io/projected/836ae3f6-06f5-4996-9f9c-cacfb63fe855-kube-api-access-crsg9\") pod \"catalog-operator-68c6474976-jkpz9\" (UID: \"836ae3f6-06f5-4996-9f9c-cacfb63fe855\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.665216 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bznlq\" (UniqueName: \"kubernetes.io/projected/4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e-kube-api-access-bznlq\") pod \"authentication-operator-69f744f599-l6p59\" (UID: \"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.668178 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.677275 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.685936 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.687556 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.691169 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdz27\" (UniqueName: \"kubernetes.io/projected/fe4278ad-53ec-4f7f-9c39-a00b6fa505c5-kube-api-access-wdz27\") pod \"router-default-5444994796-dfwzs\" (UID: \"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5\") " pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.701530 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.731617 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.733444 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wsh5\" (UniqueName: \"kubernetes.io/projected/6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72-kube-api-access-6wsh5\") pod \"control-plane-machine-set-operator-78cbb6b69f-vzzk7\" (UID: \"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.745485 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmwtm\" (UniqueName: \"kubernetes.io/projected/8ede517d-773d-4f0b-8c0a-42ae13359f95-kube-api-access-qmwtm\") pod \"marketplace-operator-79b997595-jh9j6\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") " pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.746160 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.746652 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.246629475 +0000 UTC m=+152.942176584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.749572 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.755725 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88dw5\" (UniqueName: \"kubernetes.io/projected/53a09b74-1b42-4535-a853-0752b6d1f90a-kube-api-access-88dw5\") pod \"olm-operator-6b444d44fb-fn5hp\" (UID: \"53a09b74-1b42-4535-a853-0752b6d1f90a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.766690 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8x7q\" (UniqueName: \"kubernetes.io/projected/f6d9ba19-88ea-489c-9f03-918e8b225e3b-kube-api-access-n8x7q\") pod \"migrator-59844c95c7-ps5nw\" (UID: \"f6d9ba19-88ea-489c-9f03-918e8b225e3b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.783002 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-v7bvt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.783163 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.783794 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chvf4\" (UniqueName: \"kubernetes.io/projected/e64f35db-e72b-4d73-b501-7c2aff5cc609-kube-api-access-chvf4\") pod \"machine-config-operator-74547568cd-qnjxg\" (UID: \"e64f35db-e72b-4d73-b501-7c2aff5cc609\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.787377 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.795774 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.805537 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.808078 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kll8p\" (UniqueName: \"kubernetes.io/projected/7053ea40-6d30-41d8-bcb1-8f55e95feb22-kube-api-access-kll8p\") pod \"package-server-manager-789f6589d5-26vrg\" (UID: \"7053ea40-6d30-41d8-bcb1-8f55e95feb22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.816308 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.828804 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lss9f\" (UniqueName: \"kubernetes.io/projected/78a27619-258e-4bed-afb0-1706904c6f9d-kube-api-access-lss9f\") pod \"service-ca-operator-777779d784-xvt4n\" (UID: \"78a27619-258e-4bed-afb0-1706904c6f9d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.829221 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.847767 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e740ffac-368d-45d5-89a8-25d370581945-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ll2mq\" (UID: \"e740ffac-368d-45d5-89a8-25d370581945\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.848638 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.848972 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.34895828 +0000 UTC m=+153.044505389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.871095 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77lc5\" (UniqueName: \"kubernetes.io/projected/7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8-kube-api-access-77lc5\") pod \"service-ca-9c57cc56f-2l7mq\" (UID: \"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8\") " pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.887817 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5"] Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.892937 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5dzd\" (UniqueName: \"kubernetes.io/projected/853280ad-9d5a-4fe9-852f-c0596e70dc49-kube-api-access-n5dzd\") pod \"machine-config-server-pm4xm\" (UID: \"853280ad-9d5a-4fe9-852f-c0596e70dc49\") " pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.913629 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22qbp\" (UniqueName: \"kubernetes.io/projected/1a6267b2-1222-4c0b-a890-c146d83b583d-kube-api-access-22qbp\") pod \"cluster-samples-operator-665b6dd947-tsghw\" (UID: \"1a6267b2-1222-4c0b-a890-c146d83b583d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.932335 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5rm\" (UniqueName: \"kubernetes.io/projected/20fcc16b-f2b2-4a33-a8b2-567bec77d7ca-kube-api-access-qv5rm\") pod \"csi-hostpathplugin-k7wnt\" (UID: \"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca\") " pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.952869 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:53 crc kubenswrapper[4870]: E0130 08:11:53.953212 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.453169312 +0000 UTC m=+153.148716591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.957183 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l822t\" (UniqueName: \"kubernetes.io/projected/46d623aa-7e54-4c20-aed3-3f125395a073-kube-api-access-l822t\") pod \"packageserver-d55dfcdfc-dhpbr\" (UID: \"46d623aa-7e54-4c20-aed3-3f125395a073\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.979283 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.981959 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qp2d\" (UniqueName: \"kubernetes.io/projected/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-kube-api-access-9qp2d\") pod \"collect-profiles-29496000-25p92\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:53 crc kubenswrapper[4870]: I0130 08:11:53.998925 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6dp\" (UniqueName: \"kubernetes.io/projected/a58a222f-98a0-46b4-9ea8-36a922f6a349-kube-api-access-kk6dp\") pod \"dns-default-mnsdp\" (UID: \"a58a222f-98a0-46b4-9ea8-36a922f6a349\") " pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:54 crc kubenswrapper[4870]: W0130 08:11:54.002406 4870 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9624fd43_bfa5_42c8_bebd_95a89988847d.slice/crio-b5b191882c8214cf7abd45877685493aaf17bab3ed72290121d2be06c0528f59 WatchSource:0}: Error finding container b5b191882c8214cf7abd45877685493aaf17bab3ed72290121d2be06c0528f59: Status 404 returned error can't find the container with id b5b191882c8214cf7abd45877685493aaf17bab3ed72290121d2be06c0528f59 Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.011747 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9wg\" (UniqueName: \"kubernetes.io/projected/a8a1f91a-b48e-442f-9ab6-d704b3927315-kube-api-access-nm9wg\") pod \"ingress-canary-szhwx\" (UID: \"a8a1f91a-b48e-442f-9ab6-d704b3927315\") " pod="openshift-ingress-canary/ingress-canary-szhwx" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.023112 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.023676 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xxrkx"] Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.041795 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.056982 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.060019 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.060585 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.560555427 +0000 UTC m=+153.256102536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.063821 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.083150 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.122659 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.126933 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ssxgx"] Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.129548 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" event={"ID":"b2cd7eb7-87cb-44dc-a01f-17985460c12c","Type":"ContainerStarted","Data":"bcb05793641bd67c1acbeee4554363d30fd2eaaf8071b45c4e65086e61e263dd"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.129613 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" event={"ID":"b2cd7eb7-87cb-44dc-a01f-17985460c12c","Type":"ContainerStarted","Data":"30261041a241d2ac251a9bc33bdaec03f9eb7a434d15696d60f69b0dba1f5cbf"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.134434 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s6768" event={"ID":"41ae1460-1e39-4d11-9357-3e0111521a8e","Type":"ContainerStarted","Data":"1702f30a64f9b42998cb07d3131e3070d10837649d10efef360d13a4b1741400"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.134467 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s6768" event={"ID":"41ae1460-1e39-4d11-9357-3e0111521a8e","Type":"ContainerStarted","Data":"b2e7a12e8e0a754fdea117b1a6d70f90fbef4cf713402e604223c579e30d53dd"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.135964 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.143664 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.156396 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" event={"ID":"9624fd43-bfa5-42c8-bebd-95a89988847d","Type":"ContainerStarted","Data":"b5b191882c8214cf7abd45877685493aaf17bab3ed72290121d2be06c0528f59"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.159002 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" event={"ID":"d4876c72-6cd1-43e0-b44a-45c4bd69e91f","Type":"ContainerStarted","Data":"94d52f9687de877d5fd97b94963947e16acbe6d1f11849a8cb9317ae4e717ce7"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.161368 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.162111 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.6620843 +0000 UTC m=+153.357631409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.168445 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.170752 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" event={"ID":"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa","Type":"ContainerStarted","Data":"cb8b82bd4cee4a06ed3352e7c2149843621ae9f790652a0e9beb76836c9e0e3d"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.170791 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" event={"ID":"b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa","Type":"ContainerStarted","Data":"d1e9dd4ef3e489bf0e9ad4838e5425badd078d9cbf455d47b5eb1daeb406663f"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.172303 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.189672 4870 patch_prober.go:28] interesting pod/console-operator-58897d9998-cdzxd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.189722 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-szhwx" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.189751 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" podUID="b9be3c7b-3bd5-48ba-bd5e-affe9a29d8aa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.191559 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pm4xm" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.207671 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" event={"ID":"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef","Type":"ContainerStarted","Data":"cc9b28324b264a123ca85a62a983f8f0edd9c35267625c7a8f40322a0b38f98d"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.207711 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" event={"ID":"b41bf206-4b95-49db-85b6-2e5fe6dcc5ef","Type":"ContainerStarted","Data":"c81f54e94b920e6c7cfd46f69dde3b3574a636bf82ebf62b99dce8708d66585b"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.222768 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" event={"ID":"15eddd48-9a41-41cb-a284-80d01c7f8aad","Type":"ContainerStarted","Data":"5ef5c75822c22a3f709c47c243272f7efcd72e282e48476b6a5544787711f3a1"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.222806 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" 
event={"ID":"15eddd48-9a41-41cb-a284-80d01c7f8aad","Type":"ContainerStarted","Data":"5f268d6983af1b7b879bee78cbfed53fb30971cb70a4d5ffa4f6c3907b23bc91"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.255686 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" event={"ID":"042ed63b-a1a9-4072-ae87-71b9fb98280c","Type":"ContainerStarted","Data":"85dfa0294fdcdeed334d2765c73e93b189d3c341512f6440f8c0311b0059dda2"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.255758 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" event={"ID":"042ed63b-a1a9-4072-ae87-71b9fb98280c","Type":"ContainerStarted","Data":"32bfae6fb5cd0844ffae17d236efc6733c35b12c04679da68fc6b71a72a5cb35"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.261748 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt"] Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.263446 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.263888 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.763857469 +0000 UTC m=+153.459404568 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.263516 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" event={"ID":"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715","Type":"ContainerStarted","Data":"e27beb861beaaa06426e0756d71bc535d8643b5b6bf47be8356eab351436ab77"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.281292 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" event={"ID":"739bcba5-d8ef-45fe-abf9-02d74d0d093c","Type":"ContainerStarted","Data":"048bfa0b79a0299901e375bbc07072944c6537030ca9a7cf23b20e0d3754683f"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.285337 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" event={"ID":"0373f9a1-1537-4f29-905a-b0fb2affc113","Type":"ContainerStarted","Data":"c692b0d8c3c6b489e0a3be41a0f61c863ee7572f9ff504585ba318691ad55a3a"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.294185 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" event={"ID":"c488e93c-573d-4d04-a272-699af1059a0e","Type":"ContainerStarted","Data":"704010f76326b14b45acb49d52a3c39fd09423589bc0b99052ca69b69f06912c"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.297095 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" event={"ID":"faa3ca31-2951-4f0d-84f0-0b19a32c9927","Type":"ContainerStarted","Data":"914a412c6899858cfe6663eb8b2abe03171d9143153a4ffb21f681ec056f1903"} Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.366078 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.366322 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.866294768 +0000 UTC m=+153.561841867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.366846 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.367475 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.867428361 +0000 UTC m=+153.562975480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: W0130 08:11:54.458725 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda55dd8c8_fb6d_450b_a80d_35e7223d2cff.slice/crio-414251ee3c4819ded89c78bee12a2c46a661efd5efaa339b79e4a8f836d91776 WatchSource:0}: Error finding container 414251ee3c4819ded89c78bee12a2c46a661efd5efaa339b79e4a8f836d91776: Status 404 returned error can't find the container with id 414251ee3c4819ded89c78bee12a2c46a661efd5efaa339b79e4a8f836d91776 Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.473642 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.475965 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:54.975945459 +0000 UTC m=+153.671492568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.578026 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.578774 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.07874677 +0000 UTC m=+153.774293879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.632643 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" podStartSLOduration=127.632622957 podStartE2EDuration="2m7.632622957s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:54.631974728 +0000 UTC m=+153.327521837" watchObservedRunningTime="2026-01-30 08:11:54.632622957 +0000 UTC m=+153.328170066" Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.679293 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.679434 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.179410226 +0000 UTC m=+153.874957335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.679958 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.680307 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.180298692 +0000 UTC m=+153.875845801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.781714 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.782133 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.282113883 +0000 UTC m=+153.977660992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.885343 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.886156 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.386138119 +0000 UTC m=+154.081685218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.986548 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:54 crc kubenswrapper[4870]: E0130 08:11:54.986891 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.486862518 +0000 UTC m=+154.182409627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:54 crc kubenswrapper[4870]: I0130 08:11:54.987212 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7t7rr" podStartSLOduration=127.987180137 podStartE2EDuration="2m7.987180137s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:54.987109235 +0000 UTC m=+153.682656344" watchObservedRunningTime="2026-01-30 08:11:54.987180137 +0000 UTC m=+153.682727246" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.025946 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l6p59"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.044279 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.088389 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.088970 4870 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.588951426 +0000 UTC m=+154.284498535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.187094 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mtt79" podStartSLOduration=128.187076528 podStartE2EDuration="2m8.187076528s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:55.142903246 +0000 UTC m=+153.838450355" watchObservedRunningTime="2026-01-30 08:11:55.187076528 +0000 UTC m=+153.882623637" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.190014 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.190207 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.69018137 +0000 UTC m=+154.385728489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.190293 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.190705 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.690688345 +0000 UTC m=+154.386235454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.222027 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6lcwc" podStartSLOduration=128.222008338 podStartE2EDuration="2m8.222008338s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:55.220591036 +0000 UTC m=+153.916138135" watchObservedRunningTime="2026-01-30 08:11:55.222008338 +0000 UTC m=+153.917555447" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.252272 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.252348 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.294824 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.297185 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.797164793 +0000 UTC m=+154.492711902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.343334 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" event={"ID":"2dc32e9a-d384-4f5e-b1ea-c9ac2f57e715","Type":"ContainerStarted","Data":"6f70705e79c4782be5d21a500bb13d37cd9c32f5a8a48fc77f6a5dc65f34df8f"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.360056 4870 generic.go:334] "Generic (PLEG): container finished" podID="faa3ca31-2951-4f0d-84f0-0b19a32c9927" containerID="a613a00984bbb4472b7e0c5e6ab507edb340599de69eed4287f29d41e305eeac" exitCode=0 Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.360218 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" 
event={"ID":"faa3ca31-2951-4f0d-84f0-0b19a32c9927","Type":"ContainerDied","Data":"a613a00984bbb4472b7e0c5e6ab507edb340599de69eed4287f29d41e305eeac"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.365860 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dfwzs" event={"ID":"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5","Type":"ContainerStarted","Data":"a6c4d7c89dd0a82eb8beb0b07000eaf09866aad342a4bd9eb2c90351b9a5a6c0"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.366097 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dfwzs" event={"ID":"fe4278ad-53ec-4f7f-9c39-a00b6fa505c5","Type":"ContainerStarted","Data":"70cc36eaa4b6b941f775779319cd1468328261a8aede7a08f5368a578afe8f57"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.376509 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.382403 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" event={"ID":"a361e11a-9e2f-4abf-a8c1-783f328f13a9","Type":"ContainerStarted","Data":"6a1a289b7e1ebbebf9ca0e8efc09f4b2c0ebe493771e8e186e9f8562b8e03485"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.399565 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.399957 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:55.899944152 +0000 UTC m=+154.595491251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.414529 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.422753 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g6x2r"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.428079 4870 generic.go:334] "Generic (PLEG): container finished" podID="0373f9a1-1537-4f29-905a-b0fb2affc113" containerID="161645af44e76731c52c9d8a0f7698cb7ed4ac904bddb6e18b4e43f4d491d7c3" exitCode=0 Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.428307 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" event={"ID":"0373f9a1-1537-4f29-905a-b0fb2affc113","Type":"ContainerDied","Data":"161645af44e76731c52c9d8a0f7698cb7ed4ac904bddb6e18b4e43f4d491d7c3"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.428337 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" event={"ID":"0373f9a1-1537-4f29-905a-b0fb2affc113","Type":"ContainerStarted","Data":"493c238fb45d7f8e5b3102ed782b94c8080ba2b5d6e05d93d843643190c1f7a0"} Jan 30 
08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.431102 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" event={"ID":"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e","Type":"ContainerStarted","Data":"4bc0e9346d43dfa548f826bccbd9a7d7b8966b1950a6ab5aeb4c0afa560b8b8a"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.431480 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2mj87"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.438698 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" event={"ID":"042ed63b-a1a9-4072-ae87-71b9fb98280c","Type":"ContainerStarted","Data":"0d306a2ea52be2b8748d5e188b1321971efa260e2a0ad0a3e62ebe050acf1c10"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.455313 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.461432 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pm4xm" event={"ID":"853280ad-9d5a-4fe9-852f-c0596e70dc49","Type":"ContainerStarted","Data":"d5b998d8cbbc41e213c1adaf2af84c5846cf77ac184bacad6b45dd5402381811"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.461477 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pm4xm" event={"ID":"853280ad-9d5a-4fe9-852f-c0596e70dc49","Type":"ContainerStarted","Data":"cd41dbdded3b7cd061795fa284f773eaad7c276bb3dd35ffbdab3935139c4dde"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.480292 4870 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xxrkx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": 
dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.480341 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" podUID="d4876c72-6cd1-43e0-b44a-45c4bd69e91f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Jan 30 08:11:55 crc kubenswrapper[4870]: W0130 08:11:55.492552 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa49ce7_f902_408a_94f1_da14a661e813.slice/crio-11900425e10bfa9bf6c9c649d5dac8048b1ed7e104a45655b98935b712a80d21 WatchSource:0}: Error finding container 11900425e10bfa9bf6c9c649d5dac8048b1ed7e104a45655b98935b712a80d21: Status 404 returned error can't find the container with id 11900425e10bfa9bf6c9c649d5dac8048b1ed7e104a45655b98935b712a80d21 Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.500806 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.502083 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.002049962 +0000 UTC m=+154.697597071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.502382 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" event={"ID":"c488e93c-573d-4d04-a272-699af1059a0e","Type":"ContainerStarted","Data":"1c707abb6d42fe8dbbf92f521b8a55ebafd7443ac15de36d0828dd259789e664"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.504567 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.505185 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:55 crc kubenswrapper[4870]: W0130 08:11:55.505651 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a2c6cb_fd9d_42f6_8774_647c544bd0f9.slice/crio-33351ac2c30e695815d9f91b72215544b26c926007cc52ebb8d5ab63cdf14cf9 WatchSource:0}: Error finding container 33351ac2c30e695815d9f91b72215544b26c926007cc52ebb8d5ab63cdf14cf9: Status 404 returned error can't find the container with id 
33351ac2c30e695815d9f91b72215544b26c926007cc52ebb8d5ab63cdf14cf9 Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.507518 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.007504192 +0000 UTC m=+154.703051301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.509493 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" event={"ID":"a55dd8c8-fb6d-450b-a80d-35e7223d2cff","Type":"ContainerStarted","Data":"a070c6ca6b9b4e2bad997b55632f9fbafd24283227d2d702a8aa91d7cb030972"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.509531 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" event={"ID":"a55dd8c8-fb6d-450b-a80d-35e7223d2cff","Type":"ContainerStarted","Data":"414251ee3c4819ded89c78bee12a2c46a661efd5efaa339b79e4a8f836d91776"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.513598 4870 generic.go:334] "Generic (PLEG): container finished" podID="15eddd48-9a41-41cb-a284-80d01c7f8aad" containerID="5ef5c75822c22a3f709c47c243272f7efcd72e282e48476b6a5544787711f3a1" exitCode=0 Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.513657 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" event={"ID":"15eddd48-9a41-41cb-a284-80d01c7f8aad","Type":"ContainerDied","Data":"5ef5c75822c22a3f709c47c243272f7efcd72e282e48476b6a5544787711f3a1"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.520829 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" event={"ID":"02c6a6cf-5413-4524-a86c-11fa4a19821f","Type":"ContainerStarted","Data":"306d3e8ee5b77c9fa36ff766942e7989e0f0754c0a0ad3d24ee62ac20f011b64"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.539492 4870 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8p957 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.539570 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" podUID="c488e93c-573d-4d04-a272-699af1059a0e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.546790 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s6768" event={"ID":"41ae1460-1e39-4d11-9357-3e0111521a8e","Type":"ContainerStarted","Data":"5387a8bb1939e3c9fb3b3f3dcf0a447bd8edb22a09de62e7ff0c85039e608aa1"} Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.610169 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.611630 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.111611561 +0000 UTC m=+154.807158670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.712535 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.714422 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.214385079 +0000 UTC m=+154.909932188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.729194 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkf4z"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.780440 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.782487 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.782534 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.784396 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jh9j6"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.784656 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jr94b" podStartSLOduration=128.78463885 podStartE2EDuration="2m8.78463885s" 
podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:55.779762867 +0000 UTC m=+154.475309976" watchObservedRunningTime="2026-01-30 08:11:55.78463885 +0000 UTC m=+154.480185959" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.797685 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.803025 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.818107 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.818917 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pm4xm" podStartSLOduration=5.81890729 podStartE2EDuration="5.81890729s" podCreationTimestamp="2026-01-30 08:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:55.814230342 +0000 UTC m=+154.509777461" watchObservedRunningTime="2026-01-30 08:11:55.81890729 +0000 UTC m=+154.514454399" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.821515 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.821861 4870 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.321844077 +0000 UTC m=+155.017391186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.821944 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.822173 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.322165657 +0000 UTC m=+155.017712766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.837241 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v7bvt"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.866799 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vjvf5" podStartSLOduration=129.866780541 podStartE2EDuration="2m9.866780541s" podCreationTimestamp="2026-01-30 08:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:55.862868426 +0000 UTC m=+154.558415535" watchObservedRunningTime="2026-01-30 08:11:55.866780541 +0000 UTC m=+154.562327650" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.923784 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.923977 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 08:11:56.423948596 +0000 UTC m=+155.119495705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.926359 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:55 crc kubenswrapper[4870]: E0130 08:11:55.928210 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.428193401 +0000 UTC m=+155.123740510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.962904 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" podStartSLOduration=128.962884613 podStartE2EDuration="2m8.962884613s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:55.902205485 +0000 UTC m=+154.597752594" watchObservedRunningTime="2026-01-30 08:11:55.962884613 +0000 UTC m=+154.658431722" Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.971351 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.976839 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-k7wnt"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.982822 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"] Jan 30 08:11:55 crc kubenswrapper[4870]: I0130 08:11:55.990864 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-s6768" podStartSLOduration=128.990850577 podStartE2EDuration="2m8.990850577s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:55.989583021 +0000 UTC m=+154.685130130" watchObservedRunningTime="2026-01-30 08:11:55.990850577 +0000 UTC m=+154.686397686" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.027636 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.028383 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.528369194 +0000 UTC m=+155.223916303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.032546 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-67xxt" podStartSLOduration=129.032528006 podStartE2EDuration="2m9.032528006s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.030492356 +0000 UTC m=+154.726039465" watchObservedRunningTime="2026-01-30 08:11:56.032528006 +0000 UTC m=+154.728075125" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.043025 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.053968 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mnsdp"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.068930 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.105387 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" podStartSLOduration=130.105356642 podStartE2EDuration="2m10.105356642s" podCreationTimestamp="2026-01-30 08:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.098794479 +0000 UTC m=+154.794341588" watchObservedRunningTime="2026-01-30 08:11:56.105356642 +0000 UTC m=+154.800903751" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.114207 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.129702 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dfwzs" podStartSLOduration=129.1296825 podStartE2EDuration="2m9.1296825s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.127731031 +0000 UTC m=+154.823278140" watchObservedRunningTime="2026-01-30 08:11:56.1296825 +0000 UTC m=+154.825229609" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.130684 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.131129 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.631107821 +0000 UTC m=+155.326654930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: W0130 08:11:56.133268 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20fcc16b_f2b2_4a33_a8b2_567bec77d7ca.slice/crio-e250f73a121c440b98359ca6a9970a763e46ca651a33f5e4c1db99fb7006c06b WatchSource:0}: Error finding container e250f73a121c440b98359ca6a9970a763e46ca651a33f5e4c1db99fb7006c06b: Status 404 returned error can't find the container with id e250f73a121c440b98359ca6a9970a763e46ca651a33f5e4c1db99fb7006c06b Jan 30 08:11:56 crc kubenswrapper[4870]: W0130 08:11:56.133720 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93fd6b37_eee2_4fd5_aa18_51eecea65a3b.slice/crio-1d4f6287093b36cb2637cb372b313eb6bd9562c5fa5d2aeb1ae5bac20a51d619 WatchSource:0}: Error finding container 1d4f6287093b36cb2637cb372b313eb6bd9562c5fa5d2aeb1ae5bac20a51d619: Status 404 returned error can't find the container with id 1d4f6287093b36cb2637cb372b313eb6bd9562c5fa5d2aeb1ae5bac20a51d619 Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.145494 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cdzxd" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.154345 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-szhwx"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.159767 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" podStartSLOduration=129.159750606 podStartE2EDuration="2m9.159750606s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.159316733 +0000 UTC m=+154.854863842" watchObservedRunningTime="2026-01-30 08:11:56.159750606 +0000 UTC m=+154.855297715" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.170119 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.204219 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.231479 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.231707 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.731667895 +0000 UTC m=+155.427215004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.231977 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.232539 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.732528871 +0000 UTC m=+155.428075980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: W0130 08:11:56.240132 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode740ffac_368d_45d5_89a8_25d370581945.slice/crio-d5b145700f9be201c79c1ea577ddea45dedffbdfbc9768d25f803b9a284eafeb WatchSource:0}: Error finding container d5b145700f9be201c79c1ea577ddea45dedffbdfbc9768d25f803b9a284eafeb: Status 404 returned error can't find the container with id d5b145700f9be201c79c1ea577ddea45dedffbdfbc9768d25f803b9a284eafeb Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.289729 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-2l7mq"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.297022 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg"] Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.332999 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.333119 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.833090394 +0000 UTC m=+155.528637503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.333997 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.334429 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.834413844 +0000 UTC m=+155.529960953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: W0130 08:11:56.359075 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a27619_258e_4bed_afb0_1706904c6f9d.slice/crio-6046da1499505b13ecbf857cd5774cec4a983a18a644f4c11ced33d3d17c2dc7 WatchSource:0}: Error finding container 6046da1499505b13ecbf857cd5774cec4a983a18a644f4c11ced33d3d17c2dc7: Status 404 returned error can't find the container with id 6046da1499505b13ecbf857cd5774cec4a983a18a644f4c11ced33d3d17c2dc7 Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.435440 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.435575 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.935549604 +0000 UTC m=+155.631096713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.435724 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.436106 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:56.9360983 +0000 UTC m=+155.631645409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.536745 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.537149 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.037130938 +0000 UTC m=+155.732678047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.573224 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" event={"ID":"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72","Type":"ContainerStarted","Data":"2b8c6d0d251f89dca3e2b2cf029ae7c867dfac014052789646e8a14dc4db877f"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.578192 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mnsdp" event={"ID":"a58a222f-98a0-46b4-9ea8-36a922f6a349","Type":"ContainerStarted","Data":"ed89b14a22a777a3fe4673e13d2f17c414aa363605f67f6dbc02b53bacb1dc79"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.586181 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" event={"ID":"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2","Type":"ContainerStarted","Data":"bc9a22ef20f71808a3262567697360e8e3a67bdac6f2914c62a14ad9e0a6d13e"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.586301 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" event={"ID":"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2","Type":"ContainerStarted","Data":"3ca46cade8d96a03abd528afe76eaff05d47d26d90f9fb94c295000c92ddc4c0"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.588206 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" 
event={"ID":"f6d9ba19-88ea-489c-9f03-918e8b225e3b","Type":"ContainerStarted","Data":"8184c740e6f01b41b1ba638aa4e9de1ad5d506a2e0544a861440c012bf6db070"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.589363 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" event={"ID":"93fd6b37-eee2-4fd5-aa18-51eecea65a3b","Type":"ContainerStarted","Data":"1d4f6287093b36cb2637cb372b313eb6bd9562c5fa5d2aeb1ae5bac20a51d619"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.593689 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-szhwx" event={"ID":"a8a1f91a-b48e-442f-9ab6-d704b3927315","Type":"ContainerStarted","Data":"dc63f0a9d09e2f4a8a3ef0910de53b166a39b35eddcc581b73707d038096142a"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.603683 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" event={"ID":"4c6eb485-ba6b-4f6a-8b30-a9d1da93b21e","Type":"ContainerStarted","Data":"f0aebbe0474a3162ab4bcab185203df3f074bf749e9e5d63d6da4ddca1de2204"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.611679 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" event={"ID":"d4876c72-6cd1-43e0-b44a-45c4bd69e91f","Type":"ContainerStarted","Data":"102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.613952 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ssxgx" event={"ID":"02c6a6cf-5413-4524-a86c-11fa4a19821f","Type":"ContainerStarted","Data":"aca766ffc64446f9915e30314c15cacd7356287ff174b7d03d0ec4672b1e2b47"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.616182 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" event={"ID":"38a2c6cb-fd9d-42f6-8774-647c544bd0f9","Type":"ContainerStarted","Data":"7e6196d52b3ae3ba4ea0ca83ff2e8c24460d40875921197226d496ba025d29d2"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.616231 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" event={"ID":"38a2c6cb-fd9d-42f6-8774-647c544bd0f9","Type":"ContainerStarted","Data":"33351ac2c30e695815d9f91b72215544b26c926007cc52ebb8d5ab63cdf14cf9"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.627160 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-l6p59" podStartSLOduration=130.627137911 podStartE2EDuration="2m10.627137911s" podCreationTimestamp="2026-01-30 08:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.626844432 +0000 UTC m=+155.322391551" watchObservedRunningTime="2026-01-30 08:11:56.627137911 +0000 UTC m=+155.322685020" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.638076 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" event={"ID":"539f32b7-3075-49f4-b9f6-e63ac1d76d61","Type":"ContainerStarted","Data":"213bc74675eba6ab680346a182a37c2671b6f7a547a41ca076d57df4ed3db570"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.638457 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" event={"ID":"539f32b7-3075-49f4-b9f6-e63ac1d76d61","Type":"ContainerStarted","Data":"a21e071cdedb2464c3e329e037d2e88f2545eacc6c0da5b957ceabc790a4d541"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.639824 4870 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.640350 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.140330479 +0000 UTC m=+155.835877588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.651766 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" event={"ID":"53a09b74-1b42-4535-a853-0752b6d1f90a","Type":"ContainerStarted","Data":"dfbb14b3761f36ee8e53212a7b374ccf14d9ba35d639bbc27baf76ccdeee4c61"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.656437 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" event={"ID":"0373f9a1-1537-4f29-905a-b0fb2affc113","Type":"ContainerStarted","Data":"ef577f206b990cd40e889159cb224ec7ccc76453e9f23d1415ab27b1088bc3fd"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.659449 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" 
event={"ID":"8ede517d-773d-4f0b-8c0a-42ae13359f95","Type":"ContainerStarted","Data":"97e57ef1c7ce66b357f1ed6e1f8847cdeaccd06e556466aa82594a1548b78355"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.659495 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" event={"ID":"8ede517d-773d-4f0b-8c0a-42ae13359f95","Type":"ContainerStarted","Data":"3b948b615fba724f1687e73e5fcca06ca297443c072f9ebaf1a3471eb522792b"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.660484 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.661835 4870 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jh9j6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.661884 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.662647 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" event={"ID":"faa3ca31-2951-4f0d-84f0-0b19a32c9927","Type":"ContainerStarted","Data":"e1356874b5553c3de0e80c8027a34dccd9503ab10c1652d6896f2d3991991483"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.682396 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" 
event={"ID":"836ae3f6-06f5-4996-9f9c-cacfb63fe855","Type":"ContainerStarted","Data":"3c42782e2042e17a62aba7a88551bb76c4354f2317c7baf6a90dbef09771bb79"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.695273 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" event={"ID":"e740ffac-368d-45d5-89a8-25d370581945","Type":"ContainerStarted","Data":"d5b145700f9be201c79c1ea577ddea45dedffbdfbc9768d25f803b9a284eafeb"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.702752 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" podStartSLOduration=130.702728029 podStartE2EDuration="2m10.702728029s" podCreationTimestamp="2026-01-30 08:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.701134292 +0000 UTC m=+155.396681401" watchObservedRunningTime="2026-01-30 08:11:56.702728029 +0000 UTC m=+155.398275138" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.725702 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" event={"ID":"9624fd43-bfa5-42c8-bebd-95a89988847d","Type":"ContainerStarted","Data":"819d0a3ba0aba2847ea6866af6b434dc03a10af1100b9de4cf8a2b6c90cd7fc0"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.725752 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" event={"ID":"9624fd43-bfa5-42c8-bebd-95a89988847d","Type":"ContainerStarted","Data":"43a212698789375e32b58e7d2ff8c87558474660a036cc20b6d99b35d2a87a79"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.736834 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" 
podStartSLOduration=129.736810183 podStartE2EDuration="2m9.736810183s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.725622884 +0000 UTC m=+155.421169993" watchObservedRunningTime="2026-01-30 08:11:56.736810183 +0000 UTC m=+155.432357292" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.744167 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.746364 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.246340624 +0000 UTC m=+155.941887733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.757990 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" podStartSLOduration=129.757965266 podStartE2EDuration="2m9.757965266s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.754543395 +0000 UTC m=+155.450090504" watchObservedRunningTime="2026-01-30 08:11:56.757965266 +0000 UTC m=+155.453512365" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.760126 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" event={"ID":"15eddd48-9a41-41cb-a284-80d01c7f8aad","Type":"ContainerStarted","Data":"a661d6d4097ab60ce214d8b9995fb7404122bfb23be4da944f9d5015d57b7db1"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.760350 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.787975 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:11:56 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:11:56 crc 
kubenswrapper[4870]: [+]process-running ok Jan 30 08:11:56 crc kubenswrapper[4870]: healthz check failed Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.788466 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.789009 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vn7wx" podStartSLOduration=130.788984701 podStartE2EDuration="2m10.788984701s" podCreationTimestamp="2026-01-30 08:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.788444214 +0000 UTC m=+155.483991333" watchObservedRunningTime="2026-01-30 08:11:56.788984701 +0000 UTC m=+155.484531810" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.800735 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v7bvt" event={"ID":"78280554-7b5b-4ccf-a674-2664144e4f5a","Type":"ContainerStarted","Data":"7ac8837b500e7e99f2a04037c1cd17673291f48d58dc455d6093994bf9822a5b"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.821065 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" podStartSLOduration=129.821045006 podStartE2EDuration="2m9.821045006s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.819761448 +0000 UTC m=+155.515308557" watchObservedRunningTime="2026-01-30 08:11:56.821045006 +0000 UTC m=+155.516592115" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 
08:11:56.826333 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" event={"ID":"e64f35db-e72b-4d73-b501-7c2aff5cc609","Type":"ContainerStarted","Data":"d5afab07dc1508ba107c7017534fb7ab1bb2586349374540103867bc099e0d5a"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.826404 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" event={"ID":"e64f35db-e72b-4d73-b501-7c2aff5cc609","Type":"ContainerStarted","Data":"a2a4fcc7e4c608ac838c216112233cee8b4ddd3caaf5a3361e25bb63b66c7706"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.829603 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" event={"ID":"46d623aa-7e54-4c20-aed3-3f125395a073","Type":"ContainerStarted","Data":"7df8b42d217ab65278902d2dcee76abde43d6768681053cb4e71d635d19c2d5f"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.831600 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" event={"ID":"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca","Type":"ContainerStarted","Data":"e250f73a121c440b98359ca6a9970a763e46ca651a33f5e4c1db99fb7006c06b"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.832902 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" event={"ID":"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7","Type":"ContainerStarted","Data":"a77c631c25ab71f0cef3c69513d9b0866e7e0d3305252072a16f62fc9dac93dd"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.832929 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" event={"ID":"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7","Type":"ContainerStarted","Data":"c3d25320838bc55388c93ea63e175ab91cf4b33328f8715faa7380d5ef4ae27f"} 
Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.834836 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.834924 4870 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kkf4z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.834958 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.840615 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2mj87" event={"ID":"2aa49ce7-f902-408a-94f1-da14a661e813","Type":"ContainerStarted","Data":"8f4187e8ca6a92ee4bd9e6838556b7bbedaba64d18a5aff0c37ced233ebdc3dd"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.840662 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2mj87" event={"ID":"2aa49ce7-f902-408a-94f1-da14a661e813","Type":"ContainerStarted","Data":"11900425e10bfa9bf6c9c649d5dac8048b1ed7e104a45655b98935b712a80d21"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.848511 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.851190 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.351169694 +0000 UTC m=+156.046716983 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.852386 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" event={"ID":"78a27619-258e-4bed-afb0-1706904c6f9d","Type":"ContainerStarted","Data":"6046da1499505b13ecbf857cd5774cec4a983a18a644f4c11ced33d3d17c2dc7"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.862706 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" event={"ID":"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8","Type":"ContainerStarted","Data":"ee2ec0b0e847b19de186b8dd8d6e1ea29b0e49e8543b060582c2194a4c0976cc"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.880144 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" event={"ID":"a361e11a-9e2f-4abf-a8c1-783f328f13a9","Type":"ContainerStarted","Data":"f87c9bc6db7756626c3bb355d208bad5fa6ff7a04ae5ba924ce27ef55220d78e"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.882010 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" podStartSLOduration=129.881988732 podStartE2EDuration="2m9.881988732s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.852125312 +0000 UTC m=+155.547672421" watchObservedRunningTime="2026-01-30 08:11:56.881988732 +0000 UTC m=+155.577535841" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.882440 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" podStartSLOduration=129.882434725 podStartE2EDuration="2m9.882434725s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.879573131 +0000 UTC m=+155.575120240" watchObservedRunningTime="2026-01-30 08:11:56.882434725 +0000 UTC m=+155.577981834" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.932493 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" event={"ID":"7053ea40-6d30-41d8-bcb1-8f55e95feb22","Type":"ContainerStarted","Data":"a6cd1f74bc22f4d0359e860e5ebf6a968ae994407444db4a51eff755c334d1c8"} Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.956142 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.956596 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:56 crc kubenswrapper[4870]: E0130 08:11:56.960725 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.460689321 +0000 UTC m=+156.156236430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.990773 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2mj87" podStartSLOduration=129.990745017 podStartE2EDuration="2m9.990745017s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.941348582 +0000 UTC m=+155.636895691" watchObservedRunningTime="2026-01-30 08:11:56.990745017 +0000 UTC m=+155.686292116" Jan 30 08:11:56 crc kubenswrapper[4870]: I0130 08:11:56.995051 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nw244" podStartSLOduration=129.995018874 podStartE2EDuration="2m9.995018874s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:56.99084633 +0000 UTC m=+155.686393439" watchObservedRunningTime="2026-01-30 08:11:56.995018874 +0000 UTC m=+155.690565983" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.059206 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.079273 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.579250546 +0000 UTC m=+156.274797655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.172156 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.173008 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.672968378 +0000 UTC m=+156.368515487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.233688 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.277184 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.277698 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.777675854 +0000 UTC m=+156.473222963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.378624 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.378963 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.878932088 +0000 UTC m=+156.574479197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.379723 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.380316 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.880297578 +0000 UTC m=+156.575844687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.480547 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.480715 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.980680307 +0000 UTC m=+156.676227416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.480997 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.481391 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:57.981372907 +0000 UTC m=+156.676920016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.581757 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.582177 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.082157948 +0000 UTC m=+156.777705057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.682961 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.683453 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.183436833 +0000 UTC m=+156.878983942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.757480 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:11:57 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:11:57 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:11:57 crc kubenswrapper[4870]: healthz check failed Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.757523 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.786445 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.786618 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 08:11:58.286593723 +0000 UTC m=+156.982140832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.786757 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.787113 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.287098578 +0000 UTC m=+156.982645697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.888160 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.888500 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.388481505 +0000 UTC m=+157.084028624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.939209 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" event={"ID":"53a09b74-1b42-4535-a853-0752b6d1f90a","Type":"ContainerStarted","Data":"42ad20899b9f94ef4a43eeecc11f3c9836d9eb8489f0cd70f886594b65e81c81"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.939307 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.941421 4870 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-fn5hp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.941465 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" podUID="53a09b74-1b42-4535-a853-0752b6d1f90a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.942718 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" 
event={"ID":"38a2c6cb-fd9d-42f6-8774-647c544bd0f9","Type":"ContainerStarted","Data":"944442945f4de7f04d0f44fb50c9908a21982de8b10fe905cf5ec5fc7f4bb92e"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.944809 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v7bvt" event={"ID":"78280554-7b5b-4ccf-a674-2664144e4f5a","Type":"ContainerStarted","Data":"f0f3f59b5cafdfae5225f86a90c6cf9ae1908ad7f3a92beb7a19c44d0f5332d1"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.945150 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-v7bvt" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.946973 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-v7bvt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.947341 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v7bvt" podUID="78280554-7b5b-4ccf-a674-2664144e4f5a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.948504 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qnjxg" event={"ID":"e64f35db-e72b-4d73-b501-7c2aff5cc609","Type":"ContainerStarted","Data":"219e045fac50283e20f42a18f9f0dfc83abd0a06efcfe2cdf1d02b73acaca0d6"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.960690 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" 
event={"ID":"1a6267b2-1222-4c0b-a890-c146d83b583d","Type":"ContainerStarted","Data":"a347357227dc94f9cad66b311dadd61011549ff3c1637fd2e5f1aa9526187ca5"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.960742 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" event={"ID":"1a6267b2-1222-4c0b-a890-c146d83b583d","Type":"ContainerStarted","Data":"cdda73ed7cff4b1a880889d1305b1270124964338ac480926363c03599fb46c9"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.960752 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" event={"ID":"1a6267b2-1222-4c0b-a890-c146d83b583d","Type":"ContainerStarted","Data":"b526c151540d4bd0d2349416bad116d4bf94f9f25a3754776cc42e0135c7a373"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.965551 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.965940 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.968952 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" podStartSLOduration=130.968941407 podStartE2EDuration="2m10.968941407s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:57.968254447 +0000 UTC m=+156.663801556" watchObservedRunningTime="2026-01-30 08:11:57.968941407 +0000 UTC m=+156.664488516" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.969358 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" event={"ID":"836ae3f6-06f5-4996-9f9c-cacfb63fe855","Type":"ContainerStarted","Data":"3c644e1da743db45843a7c89410d1e8d0cc7ae64f0ecbd61d7339e44b4a5a820"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.969798 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.971675 4870 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jkpz9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.971742 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" podUID="836ae3f6-06f5-4996-9f9c-cacfb63fe855" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.977388 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" event={"ID":"6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72","Type":"ContainerStarted","Data":"27068d120c5196dbb3136eda3bb61411f909927230fb86187a8aad48f808fba7"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.985145 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mnsdp" event={"ID":"a58a222f-98a0-46b4-9ea8-36a922f6a349","Type":"ContainerStarted","Data":"1b390f3142180446c1b693055b28cc2900ebef38b1a73ef2a44332d6957e844f"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.987638 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" event={"ID":"7914d7ba-40e4-4ada-9dd4-66f3d86d5dc8","Type":"ContainerStarted","Data":"1a1f2688b1bc0f658fd75296f7d41dffb74f506795545c277ef33e5ab318d257"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.990000 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:57 crc kubenswrapper[4870]: E0130 08:11:57.992206 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.492187692 +0000 UTC m=+157.187734891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.999467 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" event={"ID":"7053ea40-6d30-41d8-bcb1-8f55e95feb22","Type":"ContainerStarted","Data":"683d759e6003179c4d952f6e8ddb3547a76cf803b0597363261354aee3390294"} Jan 30 08:11:57 crc kubenswrapper[4870]: I0130 08:11:57.999543 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" event={"ID":"7053ea40-6d30-41d8-bcb1-8f55e95feb22","Type":"ContainerStarted","Data":"76081e0a15d9d1488f09e8b04bec447ab3f086735cd5d98bedf2374079c1e3c8"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.000423 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.003292 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-szhwx" event={"ID":"a8a1f91a-b48e-442f-9ab6-d704b3927315","Type":"ContainerStarted","Data":"926ec3b855a5eaa67c84948cf649ddbbf18aaa9552ca1d006f47ad74c5fccdd2"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.006205 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" 
event={"ID":"78a27619-258e-4bed-afb0-1706904c6f9d","Type":"ContainerStarted","Data":"d1e71e306955b9f4387a716e21ede390d73cd63bf2646abc6ae46f2bd401f303"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.012634 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-v7bvt" podStartSLOduration=131.012609984 podStartE2EDuration="2m11.012609984s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.004085063 +0000 UTC m=+156.699632172" watchObservedRunningTime="2026-01-30 08:11:58.012609984 +0000 UTC m=+156.708157083" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.024698 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" event={"ID":"46d623aa-7e54-4c20-aed3-3f125395a073","Type":"ContainerStarted","Data":"c75bd9b3416c21f2110f1fd2fdc5b1f4b93aec54cc03b5e05f9582a8f0fb8b8a"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.025331 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.029006 4870 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dhpbr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.029054 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" podUID="46d623aa-7e54-4c20-aed3-3f125395a073" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 
10.217.0.37:5443: connect: connection refused" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.036319 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" event={"ID":"f6d9ba19-88ea-489c-9f03-918e8b225e3b","Type":"ContainerStarted","Data":"f12a45c0b2813701cf01d7ec25541511ec2cb7be0c7a627f832aa65d24825d35"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.036376 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" event={"ID":"f6d9ba19-88ea-489c-9f03-918e8b225e3b","Type":"ContainerStarted","Data":"c0b22b82bacefd0aaf817979a115cac87fd828c84574bd2a3b27b98ff1218cb9"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.041649 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" event={"ID":"e740ffac-368d-45d5-89a8-25d370581945","Type":"ContainerStarted","Data":"249e9d632198441fe97737715cfaab4c4ef133846240467d86fd289f4ffcd016"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.042835 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" event={"ID":"93fd6b37-eee2-4fd5-aa18-51eecea65a3b","Type":"ContainerStarted","Data":"2065e95b92696a8bb664d6087b11271d4f8873eafbf3cee077ccf40c2dbf8d79"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.043064 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-g6x2r" podStartSLOduration=131.043049441 podStartE2EDuration="2m11.043049441s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.041079333 +0000 UTC m=+156.736626442" watchObservedRunningTime="2026-01-30 
08:11:58.043049441 +0000 UTC m=+156.738596550" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.056430 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" event={"ID":"539f32b7-3075-49f4-b9f6-e63ac1d76d61","Type":"ContainerStarted","Data":"328a07eeea88199302b8fed3e521523484c74721deba35a26a97335a3bf02afa"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.071515 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" event={"ID":"e2bdbb40-bece-420f-9ff7-bdeff90c8bd2","Type":"ContainerStarted","Data":"6cd585c0386c3e8287203c7e53c36c1a328f83d7c7bde25c1337303931a0e717"} Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.075180 4870 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kkf4z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.075224 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.090946 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.092357 4870 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.592326384 +0000 UTC m=+157.287873493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.096800 4870 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jh9j6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.096856 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.124474 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zghdb" podStartSLOduration=131.12445674 podStartE2EDuration="2m11.12445674s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.122521674 +0000 UTC 
m=+156.818068783" watchObservedRunningTime="2026-01-30 08:11:58.12445674 +0000 UTC m=+156.820003849" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.124721 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tsghw" podStartSLOduration=131.124715668 podStartE2EDuration="2m11.124715668s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.081755312 +0000 UTC m=+156.777302421" watchObservedRunningTime="2026-01-30 08:11:58.124715668 +0000 UTC m=+156.820262777" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.195679 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.216617 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.716593106 +0000 UTC m=+157.412140215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.233537 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ll2mq" podStartSLOduration=131.233499564 podStartE2EDuration="2m11.233499564s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.176473244 +0000 UTC m=+156.872020353" watchObservedRunningTime="2026-01-30 08:11:58.233499564 +0000 UTC m=+156.929046673" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.234396 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvwdd" podStartSLOduration=131.234384391 podStartE2EDuration="2m11.234384391s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.232069713 +0000 UTC m=+156.927616832" watchObservedRunningTime="2026-01-30 08:11:58.234384391 +0000 UTC m=+156.929931500" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.300776 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.301294 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.801248851 +0000 UTC m=+157.496796120 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.310252 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" podStartSLOduration=131.310234996 podStartE2EDuration="2m11.310234996s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.278668986 +0000 UTC m=+156.974216105" watchObservedRunningTime="2026-01-30 08:11:58.310234996 +0000 UTC m=+157.005782105" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.322064 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.322562 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.344921 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" podStartSLOduration=131.344896507 podStartE2EDuration="2m11.344896507s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.317818909 +0000 UTC m=+157.013366018" watchObservedRunningTime="2026-01-30 08:11:58.344896507 +0000 UTC m=+157.040443616" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.373689 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xvt4n" podStartSLOduration=131.373659395 podStartE2EDuration="2m11.373659395s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.347917896 +0000 UTC m=+157.043465015" watchObservedRunningTime="2026-01-30 08:11:58.373659395 +0000 UTC m=+157.069206494" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.395286 4870 csr.go:261] certificate signing request csr-f64lz is approved, waiting to be issued Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.402012 4870 csr.go:257] certificate signing request csr-f64lz is issued Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.410409 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.410928 4870 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:58.910910363 +0000 UTC m=+157.606457462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.416106 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-szhwx" podStartSLOduration=7.416080726 podStartE2EDuration="7.416080726s" podCreationTimestamp="2026-01-30 08:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.37756057 +0000 UTC m=+157.073107679" watchObservedRunningTime="2026-01-30 08:11:58.416080726 +0000 UTC m=+157.111627835" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.453194 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5nw" podStartSLOduration=131.453179289 podStartE2EDuration="2m11.453179289s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.450847941 +0000 UTC m=+157.146395050" watchObservedRunningTime="2026-01-30 08:11:58.453179289 +0000 UTC m=+157.148726398" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.454165 4870 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" podStartSLOduration=131.454159717 podStartE2EDuration="2m11.454159717s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.418149206 +0000 UTC m=+157.113696315" watchObservedRunningTime="2026-01-30 08:11:58.454159717 +0000 UTC m=+157.149706816" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.483441 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" podStartSLOduration=131.48341124 podStartE2EDuration="2m11.48341124s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.483260535 +0000 UTC m=+157.178807644" watchObservedRunningTime="2026-01-30 08:11:58.48341124 +0000 UTC m=+157.178958349" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.507447 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vzzk7" podStartSLOduration=131.507411517 podStartE2EDuration="2m11.507411517s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.506318635 +0000 UTC m=+157.201865754" watchObservedRunningTime="2026-01-30 08:11:58.507411517 +0000 UTC m=+157.202958626" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.511344 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.511739 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.011723405 +0000 UTC m=+157.707270514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.565974 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-2l7mq" podStartSLOduration=131.565951302 podStartE2EDuration="2m11.565951302s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:58.561992916 +0000 UTC m=+157.257540015" watchObservedRunningTime="2026-01-30 08:11:58.565951302 +0000 UTC m=+157.261498411" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.613434 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:58 crc 
kubenswrapper[4870]: E0130 08:11:58.614106 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.114082101 +0000 UTC m=+157.809629210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.714850 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.715062 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.215023536 +0000 UTC m=+157.910570645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.715106 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.715509 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.215501 +0000 UTC m=+157.911048109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.756576 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:11:58 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:11:58 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:11:58 crc kubenswrapper[4870]: healthz check failed Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.756675 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.816328 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.816609 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 08:11:59.316565179 +0000 UTC m=+158.012112288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.816965 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.817328 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.317313151 +0000 UTC m=+158.012860260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.867930 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.918092 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.918333 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.418292357 +0000 UTC m=+158.113839476 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:58 crc kubenswrapper[4870]: I0130 08:11:58.918427 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:58 crc kubenswrapper[4870]: E0130 08:11:58.918962 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.418949246 +0000 UTC m=+158.114496535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.020510 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.020726 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.520698945 +0000 UTC m=+158.216246054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.020835 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.021147 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.521139868 +0000 UTC m=+158.216686977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.080195 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mnsdp" event={"ID":"a58a222f-98a0-46b4-9ea8-36a922f6a349","Type":"ContainerStarted","Data":"0bb5c8bc6036bcb97d1a72d9dbc36f24717b2c952da882e0470f6832f1b5ef82"} Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.080399 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mnsdp" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.083108 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" event={"ID":"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca","Type":"ContainerStarted","Data":"03e2b42f7c8b9aa3ab6e729b31a40a3da732232ecf1e060e15a754b40c3c4965"} Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.084921 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-v7bvt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.084987 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v7bvt" podUID="78280554-7b5b-4ccf-a674-2664144e4f5a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Jan 30 08:11:59 crc kubenswrapper[4870]: 
I0130 08:11:59.084932 4870 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jh9j6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.085336 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.094431 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn5hp" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.099399 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brrxb" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.100025 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jkpz9" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.113031 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mnsdp" podStartSLOduration=9.113006226 podStartE2EDuration="9.113006226s" podCreationTimestamp="2026-01-30 08:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:11:59.110950935 +0000 UTC m=+157.806498044" watchObservedRunningTime="2026-01-30 08:11:59.113006226 +0000 UTC m=+157.808553335" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.121699 4870 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.122007 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.62196321 +0000 UTC m=+158.317510319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.122301 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.122787 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.622778184 +0000 UTC m=+158.318325283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.123300 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.223592 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.223796 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.72376531 +0000 UTC m=+158.419312409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.224169 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.226493 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.72647079 +0000 UTC m=+158.422017899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.328799 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.329843 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.829818196 +0000 UTC m=+158.525365305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.403989 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-30 08:06:58 +0000 UTC, rotation deadline is 2026-10-16 22:45:05.123217466 +0000 UTC Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.404041 4870 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6230h33m5.719178943s for next certificate rotation Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.433575 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.433962 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:11:59.933950024 +0000 UTC m=+158.629497133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.535038 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.535432 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.035410565 +0000 UTC m=+158.730957664 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.636654 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.637086 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.137069961 +0000 UTC m=+158.832617070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.697429 4870 patch_prober.go:28] interesting pod/apiserver-76f77b778f-2g2tj container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]log ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]etcd ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/generic-apiserver-start-informers ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/max-in-flight-filter ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 30 08:11:59 crc kubenswrapper[4870]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 30 08:11:59 crc kubenswrapper[4870]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/project.openshift.io-projectcache ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/openshift.io-startinformers ok Jan 30 08:11:59 crc kubenswrapper[4870]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 30 08:11:59 crc 
kubenswrapper[4870]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 30 08:11:59 crc kubenswrapper[4870]: livez check failed Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.697529 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" podUID="0373f9a1-1537-4f29-905a-b0fb2affc113" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.737911 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.738164 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.23811294 +0000 UTC m=+158.933660049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.738209 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.738623 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.238614284 +0000 UTC m=+158.934161393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.760847 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:11:59 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:11:59 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:11:59 crc kubenswrapper[4870]: healthz check failed Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.760975 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.839518 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.840491 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 08:12:00.340459035 +0000 UTC m=+159.036006144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.882848 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dhpbr" Jan 30 08:11:59 crc kubenswrapper[4870]: I0130 08:11:59.945627 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:11:59 crc kubenswrapper[4870]: E0130 08:11:59.946130 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.446114219 +0000 UTC m=+159.141661318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.046833 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.047275 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.54725174 +0000 UTC m=+159.242798839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.089725 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" event={"ID":"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca","Type":"ContainerStarted","Data":"2b70301241ea60405c8b0b8013c7390193ae81f6f401fa7f63a18489051b6db7"} Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.148553 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.150979 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.650964977 +0000 UTC m=+159.346512086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.249691 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.250108 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.750085799 +0000 UTC m=+159.445632908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.353538 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.353980 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.85396312 +0000 UTC m=+159.549510229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.354155 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rk4lj"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.364532 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.367422 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.377212 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk4lj"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.433332 4870 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.454434 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.454664 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-catalog-content\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.454688 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg24l\" (UniqueName: \"kubernetes.io/projected/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-kube-api-access-rg24l\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.454721 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.954688059 +0000 UTC m=+159.650235168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.454783 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.454909 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-utilities\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.455526 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:00.955518254 +0000 UTC m=+159.651065573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.521218 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cx2x5"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.522539 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.538778 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.555847 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.556135 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-catalog-content\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.556160 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg24l\" (UniqueName: 
\"kubernetes.io/projected/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-kube-api-access-rg24l\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.556203 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-utilities\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.556616 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-utilities\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.556695 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:01.056679934 +0000 UTC m=+159.752227043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.556915 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-catalog-content\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.562171 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx2x5"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.606864 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg24l\" (UniqueName: \"kubernetes.io/projected/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-kube-api-access-rg24l\") pod \"certified-operators-rk4lj\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") " pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.659362 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-catalog-content\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.659493 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.659552 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-utilities\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.659603 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44g22\" (UniqueName: \"kubernetes.io/projected/258d3e35-5580-4108-889c-9d5d2f80c810-kube-api-access-44g22\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.660033 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:01.16001456 +0000 UTC m=+159.855561669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.685416 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.728138 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sdlrf"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.729067 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.754905 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:00 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:00 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:00 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.754980 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.761697 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.762419 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-catalog-content\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.762563 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-utilities\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.762626 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44g22\" (UniqueName: \"kubernetes.io/projected/258d3e35-5580-4108-889c-9d5d2f80c810-kube-api-access-44g22\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.763530 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-catalog-content\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.763597 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-utilities\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.764645 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:01.264622213 +0000 UTC m=+159.960169322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.767709 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdlrf"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.846079 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44g22\" (UniqueName: \"kubernetes.io/projected/258d3e35-5580-4108-889c-9d5d2f80c810-kube-api-access-44g22\") pod \"community-operators-cx2x5\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") " pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.864021 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-utilities\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " 
pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.864088 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wlfz\" (UniqueName: \"kubernetes.io/projected/e02d35f8-2e8c-47a3-87c9-9580ab766290-kube-api-access-6wlfz\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.864151 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.864187 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-catalog-content\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.864631 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:01.36460598 +0000 UTC m=+160.060153149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.926425 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vm685"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.927768 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.948736 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vm685"] Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.970753 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.970977 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-catalog-content\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.971007 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-catalog-content\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.971045 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-utilities\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.971084 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-utilities\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.971102 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhq7x\" (UniqueName: \"kubernetes.io/projected/abc41080-75c5-421f-baa8-f05792f74564-kube-api-access-lhq7x\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.971128 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wlfz\" (UniqueName: \"kubernetes.io/projected/e02d35f8-2e8c-47a3-87c9-9580ab766290-kube-api-access-6wlfz\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.971791 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-catalog-content\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:00 crc kubenswrapper[4870]: E0130 08:12:00.971956 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 08:12:01.471925113 +0000 UTC m=+160.167472222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:00 crc kubenswrapper[4870]: I0130 08:12:00.979054 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-utilities\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.026817 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wlfz\" (UniqueName: \"kubernetes.io/projected/e02d35f8-2e8c-47a3-87c9-9580ab766290-kube-api-access-6wlfz\") pod \"certified-operators-sdlrf\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") " pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.053340 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.075443 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.075488 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-catalog-content\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.075528 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-utilities\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.075558 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhq7x\" (UniqueName: \"kubernetes.io/projected/abc41080-75c5-421f-baa8-f05792f74564-kube-api-access-lhq7x\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.076644 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-catalog-content\") pod 
\"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:01 crc kubenswrapper[4870]: E0130 08:12:01.077138 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 08:12:01.577116253 +0000 UTC m=+160.272663362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sfs65" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.077598 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-utilities\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.094608 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhq7x\" (UniqueName: \"kubernetes.io/projected/abc41080-75c5-421f-baa8-f05792f74564-kube-api-access-lhq7x\") pod \"community-operators-vm685\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") " pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.121781 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" 
event={"ID":"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca","Type":"ContainerStarted","Data":"14c69f5ef2c2a59869866261a25048d69105089421652bcd3d4a89fbbc8d330f"} Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.121837 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" event={"ID":"20fcc16b-f2b2-4a33-a8b2-567bec77d7ca","Type":"ContainerStarted","Data":"7f406aa6747911baa02be123e8dcdce8d7bdbcfb4199af028d16098d4b5732c3"} Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.142439 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.177518 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.177946 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-k7wnt" podStartSLOduration=11.177917014 podStartE2EDuration="11.177917014s" podCreationTimestamp="2026-01-30 08:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:01.174636038 +0000 UTC m=+159.870183147" watchObservedRunningTime="2026-01-30 08:12:01.177917014 +0000 UTC m=+159.873464123" Jan 30 08:12:01 crc kubenswrapper[4870]: E0130 08:12:01.178649 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 08:12:01.678632365 +0000 UTC m=+160.374179474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.202416 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk4lj"] Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.269440 4870 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-30T08:12:00.43336164Z","Handler":null,"Name":""} Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.278985 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.279050 4870 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.279089 4870 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 30 08:12:01 crc 
kubenswrapper[4870]: I0130 08:12:01.284756 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm685" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.286469 4870 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.286503 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.334237 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sfs65\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.380849 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.401043 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.435399 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.487509 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdlrf"] Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.686014 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx2x5"] Jan 30 08:12:01 crc kubenswrapper[4870]: W0130 08:12:01.704248 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod258d3e35_5580_4108_889c_9d5d2f80c810.slice/crio-ea85190d876bcbca144726c237a14b6d31ba3248e8f165a1e622d666e72b6022 WatchSource:0}: Error finding container ea85190d876bcbca144726c237a14b6d31ba3248e8f165a1e622d666e72b6022: Status 404 returned error can't find the container with id ea85190d876bcbca144726c237a14b6d31ba3248e8f165a1e622d666e72b6022 Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.770996 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vm685"] Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.780542 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:01 crc kubenswrapper[4870]: [-]has-synced failed: 
reason withheld Jan 30 08:12:01 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:01 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.780596 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.856362 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sfs65"] Jan 30 08:12:01 crc kubenswrapper[4870]: I0130 08:12:01.949283 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8nfd4" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.083911 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.133847 4870 generic.go:334] "Generic (PLEG): container finished" podID="93fd6b37-eee2-4fd5-aa18-51eecea65a3b" containerID="2065e95b92696a8bb664d6087b11271d4f8873eafbf3cee077ccf40c2dbf8d79" exitCode=0 Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.134005 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" event={"ID":"93fd6b37-eee2-4fd5-aa18-51eecea65a3b","Type":"ContainerDied","Data":"2065e95b92696a8bb664d6087b11271d4f8873eafbf3cee077ccf40c2dbf8d79"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.137017 4870 generic.go:334] "Generic (PLEG): container finished" podID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerID="1ecf5db22e2b1fa8547549ce582ecddde377bbe670b1f97e03e9e9e6f42d4dae" exitCode=0 Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 
08:12:02.137690 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlrf" event={"ID":"e02d35f8-2e8c-47a3-87c9-9580ab766290","Type":"ContainerDied","Data":"1ecf5db22e2b1fa8547549ce582ecddde377bbe670b1f97e03e9e9e6f42d4dae"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.137798 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlrf" event={"ID":"e02d35f8-2e8c-47a3-87c9-9580ab766290","Type":"ContainerStarted","Data":"17d6a9bdca6c16fe2977a640455c60bcc06dd2ad4ecdc2b9c6411506d215c0be"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.143749 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.144663 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" event={"ID":"406fb8be-c783-4ef8-8aae-5430b0226d17","Type":"ContainerStarted","Data":"5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.144762 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" event={"ID":"406fb8be-c783-4ef8-8aae-5430b0226d17","Type":"ContainerStarted","Data":"47d84e04f9b3f93637b83fdd855c471e56293ba330cba3caf1369ea3f8340bb4"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.144931 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.148313 4870 generic.go:334] "Generic (PLEG): container finished" podID="abc41080-75c5-421f-baa8-f05792f74564" containerID="c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354" exitCode=0 Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.148404 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-vm685" event={"ID":"abc41080-75c5-421f-baa8-f05792f74564","Type":"ContainerDied","Data":"c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.148444 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm685" event={"ID":"abc41080-75c5-421f-baa8-f05792f74564","Type":"ContainerStarted","Data":"96518355e0bd9b243a322652ed93adea62f75e712bc08772e1f193f3dde1d1a9"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.151045 4870 generic.go:334] "Generic (PLEG): container finished" podID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerID="2e15cf3e43d60efa400786600f10aabddcac1a402cf20155c96332c4d505ad73" exitCode=0 Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.151119 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4lj" event={"ID":"ba2950a4-e1b9-45a9-9980-1b4169e0fb16","Type":"ContainerDied","Data":"2e15cf3e43d60efa400786600f10aabddcac1a402cf20155c96332c4d505ad73"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.151148 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4lj" event={"ID":"ba2950a4-e1b9-45a9-9980-1b4169e0fb16","Type":"ContainerStarted","Data":"b4deb94680d10a0e49b737adc1e5d0d479b58878615ce9ba8009bd204fb58e39"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.158499 4870 generic.go:334] "Generic (PLEG): container finished" podID="258d3e35-5580-4108-889c-9d5d2f80c810" containerID="b8e1ab4ce4d07cf81dd3964239182751d6d8a8cb595e0cabe44b1efd32e0f612" exitCode=0 Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.159832 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2x5" 
event={"ID":"258d3e35-5580-4108-889c-9d5d2f80c810","Type":"ContainerDied","Data":"b8e1ab4ce4d07cf81dd3964239182751d6d8a8cb595e0cabe44b1efd32e0f612"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.159860 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2x5" event={"ID":"258d3e35-5580-4108-889c-9d5d2f80c810","Type":"ContainerStarted","Data":"ea85190d876bcbca144726c237a14b6d31ba3248e8f165a1e622d666e72b6022"} Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.335456 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jqng8"] Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.337316 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.358426 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.367741 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqng8"] Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.378200 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.379116 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.399317 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.399347 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.426287 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.429314 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6vk\" (UniqueName: \"kubernetes.io/projected/56cb5ce8-da4f-4c24-9805-18a91b316bcd-kube-api-access-nz6vk\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.429501 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-utilities\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.429615 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/442fd418-e9e8-4cda-8e47-0a2780ae306d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.429731 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-catalog-content\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.429818 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/442fd418-e9e8-4cda-8e47-0a2780ae306d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.532819 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/442fd418-e9e8-4cda-8e47-0a2780ae306d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.532923 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-catalog-content\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.532954 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/442fd418-e9e8-4cda-8e47-0a2780ae306d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.532973 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6vk\" (UniqueName: \"kubernetes.io/projected/56cb5ce8-da4f-4c24-9805-18a91b316bcd-kube-api-access-nz6vk\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.533006 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-utilities\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.533536 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-utilities\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.533591 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/442fd418-e9e8-4cda-8e47-0a2780ae306d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.533640 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-catalog-content\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.558716 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/442fd418-e9e8-4cda-8e47-0a2780ae306d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.566918 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6vk\" (UniqueName: \"kubernetes.io/projected/56cb5ce8-da4f-4c24-9805-18a91b316bcd-kube-api-access-nz6vk\") pod \"redhat-marketplace-jqng8\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") " pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.579717 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" podStartSLOduration=135.579693398 podStartE2EDuration="2m15.579693398s" podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:02.559803582 +0000 UTC m=+161.255350691" watchObservedRunningTime="2026-01-30 08:12:02.579693398 +0000 UTC m=+161.275240507" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.667794 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.714303 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ddg46"] Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.715893 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.716799 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.734498 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddg46"] Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.735555 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-catalog-content\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.735681 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-utilities\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.735749 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sz97\" (UniqueName: \"kubernetes.io/projected/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-kube-api-access-4sz97\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.761428 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:02 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:02 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:02 crc kubenswrapper[4870]: 
healthz check failed Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.761603 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.837864 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-catalog-content\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.838570 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-utilities\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.838640 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sz97\" (UniqueName: \"kubernetes.io/projected/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-kube-api-access-4sz97\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.840169 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-catalog-content\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.840421 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-utilities\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.862059 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sz97\" (UniqueName: \"kubernetes.io/projected/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-kube-api-access-4sz97\") pod \"redhat-marketplace-ddg46\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") " pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.970474 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:12:02 crc kubenswrapper[4870]: I0130 08:12:02.979121 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2g2tj" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.008054 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqng8"] Jan 30 08:12:03 crc kubenswrapper[4870]: W0130 08:12:03.009421 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cb5ce8_da4f_4c24_9805_18a91b316bcd.slice/crio-5318a5759e8a4ecffb11be37d9689df0b960dc674f99fd5d3cb764e4f3066de3 WatchSource:0}: Error finding container 5318a5759e8a4ecffb11be37d9689df0b960dc674f99fd5d3cb764e4f3066de3: Status 404 returned error can't find the container with id 5318a5759e8a4ecffb11be37d9689df0b960dc674f99fd5d3cb764e4f3066de3 Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.041386 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddg46" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.130282 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.176547 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqng8" event={"ID":"56cb5ce8-da4f-4c24-9805-18a91b316bcd","Type":"ContainerStarted","Data":"5318a5759e8a4ecffb11be37d9689df0b960dc674f99fd5d3cb764e4f3066de3"} Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.231304 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.239091 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.242151 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.244624 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.245156 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.349173 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.349291 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.457454 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.457707 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.457652 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.484893 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.524757 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.560196 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-secret-volume\") pod \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.560510 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qp2d\" (UniqueName: \"kubernetes.io/projected/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-kube-api-access-9qp2d\") pod \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.560703 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-config-volume\") pod \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\" (UID: \"93fd6b37-eee2-4fd5-aa18-51eecea65a3b\") " Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.561846 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-config-volume" (OuterVolumeSpecName: "config-volume") pod "93fd6b37-eee2-4fd5-aa18-51eecea65a3b" (UID: "93fd6b37-eee2-4fd5-aa18-51eecea65a3b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.567529 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "93fd6b37-eee2-4fd5-aa18-51eecea65a3b" (UID: "93fd6b37-eee2-4fd5-aa18-51eecea65a3b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.568108 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-kube-api-access-9qp2d" (OuterVolumeSpecName: "kube-api-access-9qp2d") pod "93fd6b37-eee2-4fd5-aa18-51eecea65a3b" (UID: "93fd6b37-eee2-4fd5-aa18-51eecea65a3b"). InnerVolumeSpecName "kube-api-access-9qp2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.568638 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.627449 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.627501 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.643167 4870 patch_prober.go:28] interesting pod/console-f9d7485db-2mj87 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.643265 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2mj87" podUID="2aa49ce7-f902-408a-94f1-da14a661e813" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.668291 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-config-volume\") on 
node \"crc\" DevicePath \"\"" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.668325 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.668334 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qp2d\" (UniqueName: \"kubernetes.io/projected/93fd6b37-eee2-4fd5-aa18-51eecea65a3b-kube-api-access-9qp2d\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.690515 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddg46"] Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.710856 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-85lwg"] Jan 30 08:12:03 crc kubenswrapper[4870]: E0130 08:12:03.711405 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fd6b37-eee2-4fd5-aa18-51eecea65a3b" containerName="collect-profiles" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.711432 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fd6b37-eee2-4fd5-aa18-51eecea65a3b" containerName="collect-profiles" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.711630 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fd6b37-eee2-4fd5-aa18-51eecea65a3b" containerName="collect-profiles" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.712899 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.715546 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.719789 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-85lwg"] Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.750777 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.768378 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:03 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:03 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:03 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.768447 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.785304 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-v7bvt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.785397 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-v7bvt" podUID="78280554-7b5b-4ccf-a674-2664144e4f5a" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.786219 4870 patch_prober.go:28] interesting pod/downloads-7954f5f757-v7bvt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.786293 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v7bvt" podUID="78280554-7b5b-4ccf-a674-2664144e4f5a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.852379 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.879303 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drv8z\" (UniqueName: \"kubernetes.io/projected/1d50529a-bc06-49a9-a5bf-64e91e8734c2-kube-api-access-drv8z\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.879458 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-utilities\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.879542 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-catalog-content\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.980841 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drv8z\" (UniqueName: \"kubernetes.io/projected/1d50529a-bc06-49a9-a5bf-64e91e8734c2-kube-api-access-drv8z\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.980923 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-utilities\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.980986 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-catalog-content\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.981632 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-catalog-content\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:03 crc kubenswrapper[4870]: I0130 08:12:03.981718 4870 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-utilities\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.008429 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drv8z\" (UniqueName: \"kubernetes.io/projected/1d50529a-bc06-49a9-a5bf-64e91e8734c2-kube-api-access-drv8z\") pod \"redhat-operators-85lwg\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.114011 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 08:12:04 crc kubenswrapper[4870]: W0130 08:12:04.120257 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd0bcbce8_6f90_4ccb_b5b6_163a3dd53675.slice/crio-738b24dab04ad1c07a253fcd86b532a3c395508eed7bde7827792ecb7b2579bd WatchSource:0}: Error finding container 738b24dab04ad1c07a253fcd86b532a3c395508eed7bde7827792ecb7b2579bd: Status 404 returned error can't find the container with id 738b24dab04ad1c07a253fcd86b532a3c395508eed7bde7827792ecb7b2579bd Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.124859 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.128154 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nxbdr"] Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.133235 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.144401 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nxbdr"] Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.186590 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-catalog-content\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.186669 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-utilities\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.186713 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmxfx\" (UniqueName: \"kubernetes.io/projected/f5255b75-6d10-40f0-9d11-c975458382cb-kube-api-access-hmxfx\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.237696 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675","Type":"ContainerStarted","Data":"738b24dab04ad1c07a253fcd86b532a3c395508eed7bde7827792ecb7b2579bd"} Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.274178 4870 generic.go:334] "Generic (PLEG): container finished" podID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" 
containerID="3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c" exitCode=0 Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.274259 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddg46" event={"ID":"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2","Type":"ContainerDied","Data":"3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c"} Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.274292 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddg46" event={"ID":"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2","Type":"ContainerStarted","Data":"a0bdc36a8576d5c25a0097622d42f72393c74577381da880313d27ca87e33cc7"} Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.288701 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmxfx\" (UniqueName: \"kubernetes.io/projected/f5255b75-6d10-40f0-9d11-c975458382cb-kube-api-access-hmxfx\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.288932 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-catalog-content\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.289819 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-utilities\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.291400 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-catalog-content\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.291972 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-utilities\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.300331 4870 generic.go:334] "Generic (PLEG): container finished" podID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerID="8a3a4ecde2801a20f3bb4ccdc68bab1d46b831e5569a15eb1e5876330bbb7d42" exitCode=0 Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.300598 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqng8" event={"ID":"56cb5ce8-da4f-4c24-9805-18a91b316bcd","Type":"ContainerDied","Data":"8a3a4ecde2801a20f3bb4ccdc68bab1d46b831e5569a15eb1e5876330bbb7d42"} Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.314786 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.314831 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92" event={"ID":"93fd6b37-eee2-4fd5-aa18-51eecea65a3b","Type":"ContainerDied","Data":"1d4f6287093b36cb2637cb372b313eb6bd9562c5fa5d2aeb1ae5bac20a51d619"} Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.314910 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d4f6287093b36cb2637cb372b313eb6bd9562c5fa5d2aeb1ae5bac20a51d619" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.340513 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmxfx\" (UniqueName: \"kubernetes.io/projected/f5255b75-6d10-40f0-9d11-c975458382cb-kube-api-access-hmxfx\") pod \"redhat-operators-nxbdr\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.345096 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"442fd418-e9e8-4cda-8e47-0a2780ae306d","Type":"ContainerStarted","Data":"b8cd20c1ed0f195e42077335953b04003fcd5a5b8e38705335ea3a93348af2c9"} Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.345141 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"442fd418-e9e8-4cda-8e47-0a2780ae306d","Type":"ContainerStarted","Data":"2d9181b27bd439ecb45b4b635aa0db9190165a3b423f5ddca9eedee27fed1520"} Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.393499 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.393478605 podStartE2EDuration="2.393478605s" 
podCreationTimestamp="2026-01-30 08:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:04.390970392 +0000 UTC m=+163.086517501" watchObservedRunningTime="2026-01-30 08:12:04.393478605 +0000 UTC m=+163.089025714" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.525552 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.653143 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-85lwg"] Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.756370 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:04 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:04 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:04 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:04 crc kubenswrapper[4870]: I0130 08:12:04.756472 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.039940 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nxbdr"] Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.358237 4870 generic.go:334] "Generic (PLEG): container finished" podID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerID="f4984448372f3c99bd2eb627d2f6a37eee0cab48c315336c3d5192e15f6bb85e" exitCode=0 Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.358313 
4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-85lwg" event={"ID":"1d50529a-bc06-49a9-a5bf-64e91e8734c2","Type":"ContainerDied","Data":"f4984448372f3c99bd2eb627d2f6a37eee0cab48c315336c3d5192e15f6bb85e"} Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.358628 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-85lwg" event={"ID":"1d50529a-bc06-49a9-a5bf-64e91e8734c2","Type":"ContainerStarted","Data":"1b14874ab64bd9943b3954bf834f4ae30ab6a234601d5bd7fe08c6631f1c0819"} Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.361262 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxbdr" event={"ID":"f5255b75-6d10-40f0-9d11-c975458382cb","Type":"ContainerStarted","Data":"75dc4d4ca08c96b6af316cef86b49419d2a6ad7374d685b482b8ff2fed0aeb65"} Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.366936 4870 generic.go:334] "Generic (PLEG): container finished" podID="442fd418-e9e8-4cda-8e47-0a2780ae306d" containerID="b8cd20c1ed0f195e42077335953b04003fcd5a5b8e38705335ea3a93348af2c9" exitCode=0 Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.367111 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"442fd418-e9e8-4cda-8e47-0a2780ae306d","Type":"ContainerDied","Data":"b8cd20c1ed0f195e42077335953b04003fcd5a5b8e38705335ea3a93348af2c9"} Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.371464 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675","Type":"ContainerStarted","Data":"a6b7b751164e39de50a2a14218aad750e231a678e095709e7e9878cf8f73fa45"} Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.426485 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=2.42645124 podStartE2EDuration="2.42645124s" podCreationTimestamp="2026-01-30 08:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:05.424283346 +0000 UTC m=+164.119830465" watchObservedRunningTime="2026-01-30 08:12:05.42645124 +0000 UTC m=+164.121998349" Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.754277 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:05 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:05 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:05 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:05 crc kubenswrapper[4870]: I0130 08:12:05.754354 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.052131 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.401775 4870 generic.go:334] "Generic (PLEG): container finished" podID="f5255b75-6d10-40f0-9d11-c975458382cb" containerID="91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381" exitCode=0 Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.401834 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxbdr" event={"ID":"f5255b75-6d10-40f0-9d11-c975458382cb","Type":"ContainerDied","Data":"91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381"} Jan 30 08:12:06 crc 
kubenswrapper[4870]: I0130 08:12:06.450443 4870 generic.go:334] "Generic (PLEG): container finished" podID="d0bcbce8-6f90-4ccb-b5b6-163a3dd53675" containerID="a6b7b751164e39de50a2a14218aad750e231a678e095709e7e9878cf8f73fa45" exitCode=0 Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.450977 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675","Type":"ContainerDied","Data":"a6b7b751164e39de50a2a14218aad750e231a678e095709e7e9878cf8f73fa45"} Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.755743 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:06 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:06 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:06 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.756217 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.936201 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.962435 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/442fd418-e9e8-4cda-8e47-0a2780ae306d-kube-api-access\") pod \"442fd418-e9e8-4cda-8e47-0a2780ae306d\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.962565 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/442fd418-e9e8-4cda-8e47-0a2780ae306d-kubelet-dir\") pod \"442fd418-e9e8-4cda-8e47-0a2780ae306d\" (UID: \"442fd418-e9e8-4cda-8e47-0a2780ae306d\") " Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.962694 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/442fd418-e9e8-4cda-8e47-0a2780ae306d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "442fd418-e9e8-4cda-8e47-0a2780ae306d" (UID: "442fd418-e9e8-4cda-8e47-0a2780ae306d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.963010 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/442fd418-e9e8-4cda-8e47-0a2780ae306d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:06 crc kubenswrapper[4870]: I0130 08:12:06.989392 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442fd418-e9e8-4cda-8e47-0a2780ae306d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "442fd418-e9e8-4cda-8e47-0a2780ae306d" (UID: "442fd418-e9e8-4cda-8e47-0a2780ae306d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.069712 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/442fd418-e9e8-4cda-8e47-0a2780ae306d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.485357 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.486339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"442fd418-e9e8-4cda-8e47-0a2780ae306d","Type":"ContainerDied","Data":"2d9181b27bd439ecb45b4b635aa0db9190165a3b423f5ddca9eedee27fed1520"} Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.486377 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d9181b27bd439ecb45b4b635aa0db9190165a3b423f5ddca9eedee27fed1520" Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.757289 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:07 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:07 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:07 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.757356 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.969215 4870 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.990223 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kube-api-access\") pod \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.990334 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kubelet-dir\") pod \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\" (UID: \"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675\") " Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.990462 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d0bcbce8-6f90-4ccb-b5b6-163a3dd53675" (UID: "d0bcbce8-6f90-4ccb-b5b6-163a3dd53675"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:12:07 crc kubenswrapper[4870]: I0130 08:12:07.990581 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:08 crc kubenswrapper[4870]: I0130 08:12:07.995901 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d0bcbce8-6f90-4ccb-b5b6-163a3dd53675" (UID: "d0bcbce8-6f90-4ccb-b5b6-163a3dd53675"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:08 crc kubenswrapper[4870]: I0130 08:12:08.092840 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bcbce8-6f90-4ccb-b5b6-163a3dd53675-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:08 crc kubenswrapper[4870]: I0130 08:12:08.523593 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0bcbce8-6f90-4ccb-b5b6-163a3dd53675","Type":"ContainerDied","Data":"738b24dab04ad1c07a253fcd86b532a3c395508eed7bde7827792ecb7b2579bd"} Jan 30 08:12:08 crc kubenswrapper[4870]: I0130 08:12:08.524165 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="738b24dab04ad1c07a253fcd86b532a3c395508eed7bde7827792ecb7b2579bd" Jan 30 08:12:08 crc kubenswrapper[4870]: I0130 08:12:08.523689 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 08:12:08 crc kubenswrapper[4870]: I0130 08:12:08.754125 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:08 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:08 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:08 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:08 crc kubenswrapper[4870]: I0130 08:12:08.754496 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:09 crc kubenswrapper[4870]: I0130 08:12:09.152573 4870 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-dns/dns-default-mnsdp" Jan 30 08:12:09 crc kubenswrapper[4870]: I0130 08:12:09.630630 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:12:09 crc kubenswrapper[4870]: I0130 08:12:09.657021 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b976744-b72d-4291-a32f-437fc1cfbf03-metrics-certs\") pod \"network-metrics-daemon-mp9vw\" (UID: \"7b976744-b72d-4291-a32f-437fc1cfbf03\") " pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:12:09 crc kubenswrapper[4870]: I0130 08:12:09.754915 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:09 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:09 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:09 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:09 crc kubenswrapper[4870]: I0130 08:12:09.755075 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:09 crc kubenswrapper[4870]: I0130 08:12:09.895220 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mp9vw" Jan 30 08:12:10 crc kubenswrapper[4870]: I0130 08:12:10.335729 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mp9vw"] Jan 30 08:12:10 crc kubenswrapper[4870]: I0130 08:12:10.580089 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" event={"ID":"7b976744-b72d-4291-a32f-437fc1cfbf03","Type":"ContainerStarted","Data":"e11e3efb615e509dfc9d07377a0b4baa70f7d89635ec493f7c7ad084d1c2a8bf"} Jan 30 08:12:10 crc kubenswrapper[4870]: I0130 08:12:10.754607 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:10 crc kubenswrapper[4870]: [-]has-synced failed: reason withheld Jan 30 08:12:10 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:10 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:10 crc kubenswrapper[4870]: I0130 08:12:10.754673 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:11 crc kubenswrapper[4870]: I0130 08:12:11.608266 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" event={"ID":"7b976744-b72d-4291-a32f-437fc1cfbf03","Type":"ContainerStarted","Data":"8e045380b35ca46680cb58a41fe659499348b8347b986bd2735ff522e36d555d"} Jan 30 08:12:11 crc kubenswrapper[4870]: I0130 08:12:11.754548 4870 patch_prober.go:28] interesting pod/router-default-5444994796-dfwzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Jan 30 08:12:11 crc kubenswrapper[4870]: [+]has-synced ok Jan 30 08:12:11 crc kubenswrapper[4870]: [+]process-running ok Jan 30 08:12:11 crc kubenswrapper[4870]: healthz check failed Jan 30 08:12:11 crc kubenswrapper[4870]: I0130 08:12:11.754680 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dfwzs" podUID="fe4278ad-53ec-4f7f-9c39-a00b6fa505c5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 08:12:12 crc kubenswrapper[4870]: I0130 08:12:12.753928 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:12:12 crc kubenswrapper[4870]: I0130 08:12:12.757752 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dfwzs" Jan 30 08:12:13 crc kubenswrapper[4870]: I0130 08:12:13.632621 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:12:13 crc kubenswrapper[4870]: I0130 08:12:13.637221 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:12:13 crc kubenswrapper[4870]: I0130 08:12:13.803954 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-v7bvt" Jan 30 08:12:16 crc kubenswrapper[4870]: I0130 08:12:16.105496 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkf4z"] Jan 30 08:12:16 crc kubenswrapper[4870]: I0130 08:12:16.106082 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerName="controller-manager" 
containerID="cri-o://a77c631c25ab71f0cef3c69513d9b0866e7e0d3305252072a16f62fc9dac93dd" gracePeriod=30 Jan 30 08:12:16 crc kubenswrapper[4870]: I0130 08:12:16.111596 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957"] Jan 30 08:12:16 crc kubenswrapper[4870]: I0130 08:12:16.111820 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" podUID="c488e93c-573d-4d04-a272-699af1059a0e" containerName="route-controller-manager" containerID="cri-o://1c707abb6d42fe8dbbf92f521b8a55ebafd7443ac15de36d0828dd259789e664" gracePeriod=30 Jan 30 08:12:17 crc kubenswrapper[4870]: I0130 08:12:17.664971 4870 generic.go:334] "Generic (PLEG): container finished" podID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerID="a77c631c25ab71f0cef3c69513d9b0866e7e0d3305252072a16f62fc9dac93dd" exitCode=0 Jan 30 08:12:17 crc kubenswrapper[4870]: I0130 08:12:17.665033 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" event={"ID":"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7","Type":"ContainerDied","Data":"a77c631c25ab71f0cef3c69513d9b0866e7e0d3305252072a16f62fc9dac93dd"} Jan 30 08:12:17 crc kubenswrapper[4870]: I0130 08:12:17.667302 4870 generic.go:334] "Generic (PLEG): container finished" podID="c488e93c-573d-4d04-a272-699af1059a0e" containerID="1c707abb6d42fe8dbbf92f521b8a55ebafd7443ac15de36d0828dd259789e664" exitCode=0 Jan 30 08:12:17 crc kubenswrapper[4870]: I0130 08:12:17.667356 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" event={"ID":"c488e93c-573d-4d04-a272-699af1059a0e","Type":"ContainerDied","Data":"1c707abb6d42fe8dbbf92f521b8a55ebafd7443ac15de36d0828dd259789e664"} Jan 30 08:12:19 crc kubenswrapper[4870]: I0130 08:12:19.120007 4870 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 08:12:21 crc kubenswrapper[4870]: I0130 08:12:21.441033 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.384687 4870 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8p957 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.386402 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" podUID="c488e93c-573d-4d04-a272-699af1059a0e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.661577 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.664678 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.679089 4870 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kkf4z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.679191 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708056 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8596c5f99c-f2skc"] Jan 30 08:12:24 crc kubenswrapper[4870]: E0130 08:12:24.708283 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bcbce8-6f90-4ccb-b5b6-163a3dd53675" containerName="pruner" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708303 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bcbce8-6f90-4ccb-b5b6-163a3dd53675" containerName="pruner" Jan 30 08:12:24 crc kubenswrapper[4870]: E0130 08:12:24.708315 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c488e93c-573d-4d04-a272-699af1059a0e" containerName="route-controller-manager" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708322 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c488e93c-573d-4d04-a272-699af1059a0e" containerName="route-controller-manager" Jan 30 08:12:24 crc kubenswrapper[4870]: E0130 
08:12:24.708339 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442fd418-e9e8-4cda-8e47-0a2780ae306d" containerName="pruner" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708347 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="442fd418-e9e8-4cda-8e47-0a2780ae306d" containerName="pruner" Jan 30 08:12:24 crc kubenswrapper[4870]: E0130 08:12:24.708361 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerName="controller-manager" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708368 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerName="controller-manager" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708483 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="442fd418-e9e8-4cda-8e47-0a2780ae306d" containerName="pruner" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708499 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bcbce8-6f90-4ccb-b5b6-163a3dd53675" containerName="pruner" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708508 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c488e93c-573d-4d04-a272-699af1059a0e" containerName="route-controller-manager" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708518 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" containerName="controller-manager" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.708856 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.712291 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" event={"ID":"c488e93c-573d-4d04-a272-699af1059a0e","Type":"ContainerDied","Data":"704010f76326b14b45acb49d52a3c39fd09423589bc0b99052ca69b69f06912c"} Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.712327 4870 scope.go:117] "RemoveContainer" containerID="1c707abb6d42fe8dbbf92f521b8a55ebafd7443ac15de36d0828dd259789e664" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.712412 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.721156 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" event={"ID":"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7","Type":"ContainerDied","Data":"c3d25320838bc55388c93ea63e175ab91cf4b33328f8715faa7380d5ef4ae27f"} Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.721305 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kkf4z" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.736950 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8596c5f99c-f2skc"] Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.782802 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-client-ca\") pod \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.782931 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frbv4\" (UniqueName: \"kubernetes.io/projected/c488e93c-573d-4d04-a272-699af1059a0e-kube-api-access-frbv4\") pod \"c488e93c-573d-4d04-a272-699af1059a0e\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783003 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx49z\" (UniqueName: \"kubernetes.io/projected/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-kube-api-access-nx49z\") pod \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783059 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-serving-cert\") pod \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783120 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-client-ca\") pod 
\"c488e93c-573d-4d04-a272-699af1059a0e\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783230 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-config\") pod \"c488e93c-573d-4d04-a272-699af1059a0e\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783268 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-config\") pod \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783318 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-proxy-ca-bundles\") pod \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\" (UID: \"3e75fb87-2eda-4658-a5dc-fb9424ed9cb7\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783366 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c488e93c-573d-4d04-a272-699af1059a0e-serving-cert\") pod \"c488e93c-573d-4d04-a272-699af1059a0e\" (UID: \"c488e93c-573d-4d04-a272-699af1059a0e\") " Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783583 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cnks\" (UniqueName: \"kubernetes.io/projected/a17e1099-eed8-4519-af45-260df6408a0b-kube-api-access-2cnks\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783612 4870 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-client-ca" (OuterVolumeSpecName: "client-ca") pod "c488e93c-573d-4d04-a272-699af1059a0e" (UID: "c488e93c-573d-4d04-a272-699af1059a0e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783722 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a17e1099-eed8-4519-af45-260df6408a0b-serving-cert\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783777 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-config\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.783945 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-proxy-ca-bundles\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.784100 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-client-ca\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: 
\"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.784377 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" (UID: "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.784101 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-config" (OuterVolumeSpecName: "config") pod "c488e93c-573d-4d04-a272-699af1059a0e" (UID: "c488e93c-573d-4d04-a272-699af1059a0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.784594 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.784474 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-config" (OuterVolumeSpecName: "config") pod "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" (UID: "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.784675 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-client-ca" (OuterVolumeSpecName: "client-ca") pod "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" (UID: "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.789598 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c488e93c-573d-4d04-a272-699af1059a0e-kube-api-access-frbv4" (OuterVolumeSpecName: "kube-api-access-frbv4") pod "c488e93c-573d-4d04-a272-699af1059a0e" (UID: "c488e93c-573d-4d04-a272-699af1059a0e"). InnerVolumeSpecName "kube-api-access-frbv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.789602 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" (UID: "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.789663 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c488e93c-573d-4d04-a272-699af1059a0e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c488e93c-573d-4d04-a272-699af1059a0e" (UID: "c488e93c-573d-4d04-a272-699af1059a0e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.791253 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-kube-api-access-nx49z" (OuterVolumeSpecName: "kube-api-access-nx49z") pod "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" (UID: "3e75fb87-2eda-4658-a5dc-fb9424ed9cb7"). InnerVolumeSpecName "kube-api-access-nx49z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886269 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cnks\" (UniqueName: \"kubernetes.io/projected/a17e1099-eed8-4519-af45-260df6408a0b-kube-api-access-2cnks\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886358 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a17e1099-eed8-4519-af45-260df6408a0b-serving-cert\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886394 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-config\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886434 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-proxy-ca-bundles\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886477 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-client-ca\") pod 
\"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886545 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c488e93c-573d-4d04-a272-699af1059a0e-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886564 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886578 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886595 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c488e93c-573d-4d04-a272-699af1059a0e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886611 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886623 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frbv4\" (UniqueName: \"kubernetes.io/projected/c488e93c-573d-4d04-a272-699af1059a0e-kube-api-access-frbv4\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886635 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx49z\" (UniqueName: \"kubernetes.io/projected/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-kube-api-access-nx49z\") on node 
\"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.886647 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.888054 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-client-ca\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.888135 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-proxy-ca-bundles\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.889409 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-config\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.895113 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a17e1099-eed8-4519-af45-260df6408a0b-serving-cert\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:24 crc kubenswrapper[4870]: I0130 08:12:24.906078 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cnks\" (UniqueName: \"kubernetes.io/projected/a17e1099-eed8-4519-af45-260df6408a0b-kube-api-access-2cnks\") pod \"controller-manager-8596c5f99c-f2skc\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:25 crc kubenswrapper[4870]: I0130 08:12:25.040323 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:25 crc kubenswrapper[4870]: I0130 08:12:25.070045 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957"] Jan 30 08:12:25 crc kubenswrapper[4870]: I0130 08:12:25.080180 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8p957"] Jan 30 08:12:25 crc kubenswrapper[4870]: I0130 08:12:25.086436 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkf4z"] Jan 30 08:12:25 crc kubenswrapper[4870]: I0130 08:12:25.091225 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kkf4z"] Jan 30 08:12:25 crc kubenswrapper[4870]: I0130 08:12:25.250225 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:12:25 crc kubenswrapper[4870]: I0130 08:12:25.250736 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:12:26 crc kubenswrapper[4870]: I0130 08:12:26.081633 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e75fb87-2eda-4658-a5dc-fb9424ed9cb7" path="/var/lib/kubelet/pods/3e75fb87-2eda-4658-a5dc-fb9424ed9cb7/volumes" Jan 30 08:12:26 crc kubenswrapper[4870]: I0130 08:12:26.082627 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c488e93c-573d-4d04-a272-699af1059a0e" path="/var/lib/kubelet/pods/c488e93c-573d-4d04-a272-699af1059a0e/volumes" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.173849 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn"] Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.175024 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.178850 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.179530 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.180329 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.180336 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn"] Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.180485 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" 
Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.182144 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.186662 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.224379 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-config\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.224503 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276q6\" (UniqueName: \"kubernetes.io/projected/5914070f-d811-4c53-962e-62e819772201-kube-api-access-276q6\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.224560 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-client-ca\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.224689 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5914070f-d811-4c53-962e-62e819772201-serving-cert\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.326381 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-config\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.326445 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276q6\" (UniqueName: \"kubernetes.io/projected/5914070f-d811-4c53-962e-62e819772201-kube-api-access-276q6\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.326491 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-client-ca\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.326600 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5914070f-d811-4c53-962e-62e819772201-serving-cert\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" 
Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.327951 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-client-ca\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.330032 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-config\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.332476 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5914070f-d811-4c53-962e-62e819772201-serving-cert\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.348389 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276q6\" (UniqueName: \"kubernetes.io/projected/5914070f-d811-4c53-962e-62e819772201-kube-api-access-276q6\") pod \"route-controller-manager-6746697959-z2jtn\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:27 crc kubenswrapper[4870]: I0130 08:12:27.512216 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:33 crc kubenswrapper[4870]: E0130 08:12:33.908571 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 30 08:12:33 crc kubenswrapper[4870]: E0130 08:12:33.909652 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lhq7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Star
tupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vm685_openshift-marketplace(abc41080-75c5-421f-baa8-f05792f74564): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 08:12:33 crc kubenswrapper[4870]: E0130 08:12:33.910827 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vm685" podUID="abc41080-75c5-421f-baa8-f05792f74564" Jan 30 08:12:33 crc kubenswrapper[4870]: E0130 08:12:33.951501 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 08:12:33 crc kubenswrapper[4870]: E0130 08:12:33.951665 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4sz97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ddg46_openshift-marketplace(025ee8c8-8a97-4158-88fb-c4fa23f5c9c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 08:12:33 crc kubenswrapper[4870]: E0130 08:12:33.952857 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ddg46" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" Jan 30 08:12:34 crc 
kubenswrapper[4870]: I0130 08:12:34.087571 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-26vrg" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.592812 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ddg46" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.593153 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vm685" podUID="abc41080-75c5-421f-baa8-f05792f74564" Jan 30 08:12:35 crc kubenswrapper[4870]: I0130 08:12:35.647636 4870 scope.go:117] "RemoveContainer" containerID="a77c631c25ab71f0cef3c69513d9b0866e7e0d3305252072a16f62fc9dac93dd" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.727096 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.727858 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.729163 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-44g22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-cx2x5_openshift-marketplace(258d3e35-5580-4108-889c-9d5d2f80c810): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.729301 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rg24l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rk4lj_openshift-marketplace(ba2950a4-e1b9-45a9-9980-1b4169e0fb16): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.731583 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cx2x5" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.731694 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rk4lj" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.757670 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.758276 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wlfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sdlrf_openshift-marketplace(e02d35f8-2e8c-47a3-87c9-9580ab766290): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.759620 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sdlrf" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" Jan 30 08:12:35 crc 
kubenswrapper[4870]: E0130 08:12:35.814338 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cx2x5" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.814339 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rk4lj" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" Jan 30 08:12:35 crc kubenswrapper[4870]: E0130 08:12:35.814390 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sdlrf" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" Jan 30 08:12:35 crc kubenswrapper[4870]: I0130 08:12:35.923042 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8596c5f99c-f2skc"] Jan 30 08:12:35 crc kubenswrapper[4870]: W0130 08:12:35.939034 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda17e1099_eed8_4519_af45_260df6408a0b.slice/crio-c3a5d53e77564e84f7b91da983d0401bd24c456bb36029d2060aa1e9d103cd82 WatchSource:0}: Error finding container c3a5d53e77564e84f7b91da983d0401bd24c456bb36029d2060aa1e9d103cd82: Status 404 returned error can't find the container with id c3a5d53e77564e84f7b91da983d0401bd24c456bb36029d2060aa1e9d103cd82 Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.048572 4870 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn"] Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.106507 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8596c5f99c-f2skc"] Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.207750 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn"] Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.804688 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" event={"ID":"a17e1099-eed8-4519-af45-260df6408a0b","Type":"ContainerStarted","Data":"c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.804973 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" event={"ID":"a17e1099-eed8-4519-af45-260df6408a0b","Type":"ContainerStarted","Data":"c3a5d53e77564e84f7b91da983d0401bd24c456bb36029d2060aa1e9d103cd82"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.805074 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" podUID="a17e1099-eed8-4519-af45-260df6408a0b" containerName="controller-manager" containerID="cri-o://c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab" gracePeriod=30 Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.805711 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.808861 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mp9vw" 
event={"ID":"7b976744-b72d-4291-a32f-437fc1cfbf03","Type":"ContainerStarted","Data":"e62f5616c575397527b0b07778b565294e2dac939498e9fd49ba103ed954c034"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.810569 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.816043 4870 generic.go:334] "Generic (PLEG): container finished" podID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerID="6734abf7e123160f7f9ec15e63bcacb2803b7e9b5f597cb9ce9439f6abad0e28" exitCode=0 Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.816230 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqng8" event={"ID":"56cb5ce8-da4f-4c24-9805-18a91b316bcd","Type":"ContainerDied","Data":"6734abf7e123160f7f9ec15e63bcacb2803b7e9b5f597cb9ce9439f6abad0e28"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.821712 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-85lwg" event={"ID":"1d50529a-bc06-49a9-a5bf-64e91e8734c2","Type":"ContainerStarted","Data":"d0dc443c5c9b20693d4448270af7993e64d959e03cda7b880c0de95b2ee5007b"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.824392 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxbdr" event={"ID":"f5255b75-6d10-40f0-9d11-c975458382cb","Type":"ContainerStarted","Data":"72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.826010 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" event={"ID":"5914070f-d811-4c53-962e-62e819772201","Type":"ContainerStarted","Data":"922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.826037 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" event={"ID":"5914070f-d811-4c53-962e-62e819772201","Type":"ContainerStarted","Data":"27e7288fbc9550c8c94128f62c3018e54a3ae3ca4153343d75341c7c9ed7ce95"} Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.826093 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.826058 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" podUID="5914070f-d811-4c53-962e-62e819772201" containerName="route-controller-manager" containerID="cri-o://922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f" gracePeriod=30 Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.833145 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" podStartSLOduration=20.833124277 podStartE2EDuration="20.833124277s" podCreationTimestamp="2026-01-30 08:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:36.828903805 +0000 UTC m=+195.524450914" watchObservedRunningTime="2026-01-30 08:12:36.833124277 +0000 UTC m=+195.528671386" Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.837757 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.866958 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mp9vw" podStartSLOduration=169.866928239 podStartE2EDuration="2m49.866928239s" 
podCreationTimestamp="2026-01-30 08:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:36.866903528 +0000 UTC m=+195.562450637" watchObservedRunningTime="2026-01-30 08:12:36.866928239 +0000 UTC m=+195.562475348" Jan 30 08:12:36 crc kubenswrapper[4870]: I0130 08:12:36.933508 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" podStartSLOduration=20.933483852 podStartE2EDuration="20.933483852s" podCreationTimestamp="2026-01-30 08:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:36.929222439 +0000 UTC m=+195.624769558" watchObservedRunningTime="2026-01-30 08:12:36.933483852 +0000 UTC m=+195.629030961" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.222758 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.233428 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.259600 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c6648756f-h84rs"] Jan 30 08:12:37 crc kubenswrapper[4870]: E0130 08:12:37.259918 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17e1099-eed8-4519-af45-260df6408a0b" containerName="controller-manager" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.259943 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17e1099-eed8-4519-af45-260df6408a0b" containerName="controller-manager" Jan 30 08:12:37 crc kubenswrapper[4870]: E0130 08:12:37.259972 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5914070f-d811-4c53-962e-62e819772201" containerName="route-controller-manager" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.259982 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5914070f-d811-4c53-962e-62e819772201" containerName="route-controller-manager" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.260116 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="5914070f-d811-4c53-962e-62e819772201" containerName="route-controller-manager" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.260135 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a17e1099-eed8-4519-af45-260df6408a0b" containerName="controller-manager" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.260597 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.282725 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c6648756f-h84rs"] Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289564 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-client-ca\") pod \"a17e1099-eed8-4519-af45-260df6408a0b\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289646 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-proxy-ca-bundles\") pod \"a17e1099-eed8-4519-af45-260df6408a0b\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289674 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-config\") pod \"5914070f-d811-4c53-962e-62e819772201\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289703 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276q6\" (UniqueName: \"kubernetes.io/projected/5914070f-d811-4c53-962e-62e819772201-kube-api-access-276q6\") pod \"5914070f-d811-4c53-962e-62e819772201\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289732 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-config\") pod \"a17e1099-eed8-4519-af45-260df6408a0b\" (UID: 
\"a17e1099-eed8-4519-af45-260df6408a0b\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289794 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cnks\" (UniqueName: \"kubernetes.io/projected/a17e1099-eed8-4519-af45-260df6408a0b-kube-api-access-2cnks\") pod \"a17e1099-eed8-4519-af45-260df6408a0b\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289813 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a17e1099-eed8-4519-af45-260df6408a0b-serving-cert\") pod \"a17e1099-eed8-4519-af45-260df6408a0b\" (UID: \"a17e1099-eed8-4519-af45-260df6408a0b\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289837 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-client-ca\") pod \"5914070f-d811-4c53-962e-62e819772201\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.289866 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5914070f-d811-4c53-962e-62e819772201-serving-cert\") pod \"5914070f-d811-4c53-962e-62e819772201\" (UID: \"5914070f-d811-4c53-962e-62e819772201\") " Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.290581 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a17e1099-eed8-4519-af45-260df6408a0b" (UID: "a17e1099-eed8-4519-af45-260df6408a0b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.291367 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-config" (OuterVolumeSpecName: "config") pod "a17e1099-eed8-4519-af45-260df6408a0b" (UID: "a17e1099-eed8-4519-af45-260df6408a0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.291589 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-client-ca" (OuterVolumeSpecName: "client-ca") pod "5914070f-d811-4c53-962e-62e819772201" (UID: "5914070f-d811-4c53-962e-62e819772201"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.291763 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-config" (OuterVolumeSpecName: "config") pod "5914070f-d811-4c53-962e-62e819772201" (UID: "5914070f-d811-4c53-962e-62e819772201"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.292176 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a17e1099-eed8-4519-af45-260df6408a0b" (UID: "a17e1099-eed8-4519-af45-260df6408a0b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.303049 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a17e1099-eed8-4519-af45-260df6408a0b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a17e1099-eed8-4519-af45-260df6408a0b" (UID: "a17e1099-eed8-4519-af45-260df6408a0b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.303218 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a17e1099-eed8-4519-af45-260df6408a0b-kube-api-access-2cnks" (OuterVolumeSpecName: "kube-api-access-2cnks") pod "a17e1099-eed8-4519-af45-260df6408a0b" (UID: "a17e1099-eed8-4519-af45-260df6408a0b"). InnerVolumeSpecName "kube-api-access-2cnks". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.303285 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5914070f-d811-4c53-962e-62e819772201-kube-api-access-276q6" (OuterVolumeSpecName: "kube-api-access-276q6") pod "5914070f-d811-4c53-962e-62e819772201" (UID: "5914070f-d811-4c53-962e-62e819772201"). InnerVolumeSpecName "kube-api-access-276q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.304054 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5914070f-d811-4c53-962e-62e819772201-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5914070f-d811-4c53-962e-62e819772201" (UID: "5914070f-d811-4c53-962e-62e819772201"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.391833 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-config\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392004 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-serving-cert\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392045 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-proxy-ca-bundles\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392106 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncp7p\" (UniqueName: \"kubernetes.io/projected/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-kube-api-access-ncp7p\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392293 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-client-ca\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392536 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cnks\" (UniqueName: \"kubernetes.io/projected/a17e1099-eed8-4519-af45-260df6408a0b-kube-api-access-2cnks\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392557 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a17e1099-eed8-4519-af45-260df6408a0b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392574 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392585 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5914070f-d811-4c53-962e-62e819772201-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392601 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392611 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392622 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5914070f-d811-4c53-962e-62e819772201-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392631 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276q6\" (UniqueName: \"kubernetes.io/projected/5914070f-d811-4c53-962e-62e819772201-kube-api-access-276q6\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.392643 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a17e1099-eed8-4519-af45-260df6408a0b-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.494236 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-config\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.495277 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-serving-cert\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.495359 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-proxy-ca-bundles\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.495571 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ncp7p\" (UniqueName: \"kubernetes.io/projected/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-kube-api-access-ncp7p\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.495696 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-client-ca\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.496808 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-config\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.497053 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-proxy-ca-bundles\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.497120 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-client-ca\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc 
kubenswrapper[4870]: I0130 08:12:37.501517 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-serving-cert\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.520855 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncp7p\" (UniqueName: \"kubernetes.io/projected/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-kube-api-access-ncp7p\") pod \"controller-manager-7c6648756f-h84rs\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.629354 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.834159 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqng8" event={"ID":"56cb5ce8-da4f-4c24-9805-18a91b316bcd","Type":"ContainerStarted","Data":"5086a69b5f8df7175222c9a53597ceeaa092692fca7dd2ea0dc59c15c50cec17"} Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.837226 4870 generic.go:334] "Generic (PLEG): container finished" podID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerID="d0dc443c5c9b20693d4448270af7993e64d959e03cda7b880c0de95b2ee5007b" exitCode=0 Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.837339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-85lwg" event={"ID":"1d50529a-bc06-49a9-a5bf-64e91e8734c2","Type":"ContainerDied","Data":"d0dc443c5c9b20693d4448270af7993e64d959e03cda7b880c0de95b2ee5007b"} Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.840185 4870 
generic.go:334] "Generic (PLEG): container finished" podID="f5255b75-6d10-40f0-9d11-c975458382cb" containerID="72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36" exitCode=0 Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.840270 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxbdr" event={"ID":"f5255b75-6d10-40f0-9d11-c975458382cb","Type":"ContainerDied","Data":"72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36"} Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.845083 4870 generic.go:334] "Generic (PLEG): container finished" podID="5914070f-d811-4c53-962e-62e819772201" containerID="922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f" exitCode=0 Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.845196 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" event={"ID":"5914070f-d811-4c53-962e-62e819772201","Type":"ContainerDied","Data":"922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f"} Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.845207 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.845233 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn" event={"ID":"5914070f-d811-4c53-962e-62e819772201","Type":"ContainerDied","Data":"27e7288fbc9550c8c94128f62c3018e54a3ae3ca4153343d75341c7c9ed7ce95"} Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.845261 4870 scope.go:117] "RemoveContainer" containerID="922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.852179 4870 generic.go:334] "Generic (PLEG): container finished" podID="a17e1099-eed8-4519-af45-260df6408a0b" containerID="c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab" exitCode=0 Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.852377 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.852454 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" event={"ID":"a17e1099-eed8-4519-af45-260df6408a0b","Type":"ContainerDied","Data":"c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab"} Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.852538 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8596c5f99c-f2skc" event={"ID":"a17e1099-eed8-4519-af45-260df6408a0b","Type":"ContainerDied","Data":"c3a5d53e77564e84f7b91da983d0401bd24c456bb36029d2060aa1e9d103cd82"} Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.861062 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jqng8" podStartSLOduration=2.956286441 podStartE2EDuration="35.861020893s" podCreationTimestamp="2026-01-30 08:12:02 +0000 UTC" firstStartedPulling="2026-01-30 08:12:04.31392498 +0000 UTC m=+163.009472089" lastFinishedPulling="2026-01-30 08:12:37.218659432 +0000 UTC m=+195.914206541" observedRunningTime="2026-01-30 08:12:37.860104165 +0000 UTC m=+196.555651274" watchObservedRunningTime="2026-01-30 08:12:37.861020893 +0000 UTC m=+196.556568002" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.892154 4870 scope.go:117] "RemoveContainer" containerID="922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f" Jan 30 08:12:37 crc kubenswrapper[4870]: E0130 08:12:37.892777 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f\": container with ID starting with 922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f not found: ID does not exist" 
containerID="922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.892810 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f"} err="failed to get container status \"922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f\": rpc error: code = NotFound desc = could not find container \"922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f\": container with ID starting with 922a1b15ac187a2b816c2cea61c9a2c801e0c9ab3229f0eacf89f9c59354543f not found: ID does not exist" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.892855 4870 scope.go:117] "RemoveContainer" containerID="c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.921045 4870 scope.go:117] "RemoveContainer" containerID="c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab" Jan 30 08:12:37 crc kubenswrapper[4870]: E0130 08:12:37.922282 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab\": container with ID starting with c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab not found: ID does not exist" containerID="c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.922375 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab"} err="failed to get container status \"c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab\": rpc error: code = NotFound desc = could not find container \"c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab\": container with ID starting with 
c68b026edcf68806f3dc17b372ccdee94fa2aa4f5458d1613f492f6076a996ab not found: ID does not exist" Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.936256 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8596c5f99c-f2skc"] Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.943113 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8596c5f99c-f2skc"] Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.948647 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn"] Jan 30 08:12:37 crc kubenswrapper[4870]: I0130 08:12:37.953714 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6746697959-z2jtn"] Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.057957 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c6648756f-h84rs"] Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.083562 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5914070f-d811-4c53-962e-62e819772201" path="/var/lib/kubelet/pods/5914070f-d811-4c53-962e-62e819772201/volumes" Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.084141 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a17e1099-eed8-4519-af45-260df6408a0b" path="/var/lib/kubelet/pods/a17e1099-eed8-4519-af45-260df6408a0b/volumes" Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.862518 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" event={"ID":"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05","Type":"ContainerStarted","Data":"df06edc5cedd10f1cf6063d24bbddba4264e6ef76993b8981ce5385c15ccb756"} Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.863120 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" event={"ID":"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05","Type":"ContainerStarted","Data":"29687b205e4bfe58a29a1a38844f13e1a74004c67624566a0e52141eca348418"} Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.864580 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.868404 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-85lwg" event={"ID":"1d50529a-bc06-49a9-a5bf-64e91e8734c2","Type":"ContainerStarted","Data":"bbc787722aa3ad5d86b9358f4c93fb7295e3213baf6f9aa990b2876df2f315f2"} Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.871614 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxbdr" event={"ID":"f5255b75-6d10-40f0-9d11-c975458382cb","Type":"ContainerStarted","Data":"c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69"} Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.872568 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.884785 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" podStartSLOduration=2.884758531 podStartE2EDuration="2.884758531s" podCreationTimestamp="2026-01-30 08:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:38.881049825 +0000 UTC m=+197.576596934" watchObservedRunningTime="2026-01-30 08:12:38.884758531 +0000 UTC m=+197.580305650" Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.899176 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nxbdr" podStartSLOduration=2.981041084 podStartE2EDuration="34.899156432s" podCreationTimestamp="2026-01-30 08:12:04 +0000 UTC" firstStartedPulling="2026-01-30 08:12:06.421239209 +0000 UTC m=+165.116786318" lastFinishedPulling="2026-01-30 08:12:38.339354557 +0000 UTC m=+197.034901666" observedRunningTime="2026-01-30 08:12:38.898273254 +0000 UTC m=+197.593820373" watchObservedRunningTime="2026-01-30 08:12:38.899156432 +0000 UTC m=+197.594703541" Jan 30 08:12:38 crc kubenswrapper[4870]: I0130 08:12:38.914459 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-85lwg" podStartSLOduration=2.94960082 podStartE2EDuration="35.914439329s" podCreationTimestamp="2026-01-30 08:12:03 +0000 UTC" firstStartedPulling="2026-01-30 08:12:05.369357697 +0000 UTC m=+164.064904806" lastFinishedPulling="2026-01-30 08:12:38.334196206 +0000 UTC m=+197.029743315" observedRunningTime="2026-01-30 08:12:38.913605503 +0000 UTC m=+197.609152622" watchObservedRunningTime="2026-01-30 08:12:38.914439329 +0000 UTC m=+197.609986438" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.185082 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h"] Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.185939 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.189229 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.189388 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.189530 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.189736 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.189946 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.191189 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.247738 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-serving-cert\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.247807 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-config\") pod 
\"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.247835 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-client-ca\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.247859 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4pz\" (UniqueName: \"kubernetes.io/projected/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-kube-api-access-jh4pz\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.300117 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h"] Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.348634 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-client-ca\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.348705 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh4pz\" (UniqueName: \"kubernetes.io/projected/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-kube-api-access-jh4pz\") pod 
\"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.348774 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-serving-cert\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.348810 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-config\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.350060 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-client-ca\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.350204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-config\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.366745 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-serving-cert\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.369762 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4pz\" (UniqueName: \"kubernetes.io/projected/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-kube-api-access-jh4pz\") pod \"route-controller-manager-9f547cf58-vv68h\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.507212 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:40 crc kubenswrapper[4870]: I0130 08:12:40.966105 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h"] Jan 30 08:12:41 crc kubenswrapper[4870]: I0130 08:12:41.619511 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xxrkx"] Jan 30 08:12:41 crc kubenswrapper[4870]: I0130 08:12:41.885276 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" event={"ID":"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef","Type":"ContainerStarted","Data":"c5fa843ed751475d23b4cc7c99550ca1b426459de45a94b59e3341fcd829105d"} Jan 30 08:12:41 crc kubenswrapper[4870]: I0130 08:12:41.885627 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" 
event={"ID":"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef","Type":"ContainerStarted","Data":"82eb7cd80398c1cdbf2869d9e797591e5a6a4357ecdd3aada431258c621248cb"} Jan 30 08:12:41 crc kubenswrapper[4870]: I0130 08:12:41.886735 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:41 crc kubenswrapper[4870]: I0130 08:12:41.898175 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:41 crc kubenswrapper[4870]: I0130 08:12:41.912424 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" podStartSLOduration=5.91240329 podStartE2EDuration="5.91240329s" podCreationTimestamp="2026-01-30 08:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:41.908436646 +0000 UTC m=+200.603983755" watchObservedRunningTime="2026-01-30 08:12:41.91240329 +0000 UTC m=+200.607950399" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.014156 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.014998 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.017389 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.019159 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.038159 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.070040 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efc383a2-011d-40cd-95b9-5c1f97710135-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.070114 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efc383a2-011d-40cd-95b9-5c1f97710135-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.171411 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efc383a2-011d-40cd-95b9-5c1f97710135-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.171492 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/efc383a2-011d-40cd-95b9-5c1f97710135-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.171567 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efc383a2-011d-40cd-95b9-5c1f97710135-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.191003 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efc383a2-011d-40cd-95b9-5c1f97710135-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.332747 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.668261 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.668685 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.787203 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 08:12:42 crc kubenswrapper[4870]: W0130 08:12:42.793109 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podefc383a2_011d_40cd_95b9_5c1f97710135.slice/crio-2c5c61c58732b27312a9b5633b48aec98bed1b8b0039f321de6adb67529a94a2 WatchSource:0}: Error finding container 2c5c61c58732b27312a9b5633b48aec98bed1b8b0039f321de6adb67529a94a2: Status 404 returned error can't find the container with id 2c5c61c58732b27312a9b5633b48aec98bed1b8b0039f321de6adb67529a94a2 Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.847301 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.897623 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"efc383a2-011d-40cd-95b9-5c1f97710135","Type":"ContainerStarted","Data":"2c5c61c58732b27312a9b5633b48aec98bed1b8b0039f321de6adb67529a94a2"} Jan 30 08:12:42 crc kubenswrapper[4870]: I0130 08:12:42.956254 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:12:43 crc kubenswrapper[4870]: I0130 08:12:43.905129 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"efc383a2-011d-40cd-95b9-5c1f97710135","Type":"ContainerStarted","Data":"0c91a5504b707e3d6da8d033947fad87369d660994ee2008baa53a00db707413"} Jan 30 08:12:43 crc kubenswrapper[4870]: I0130 08:12:43.922580 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.922561057 podStartE2EDuration="1.922561057s" podCreationTimestamp="2026-01-30 08:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:43.921284968 +0000 UTC m=+202.616832077" watchObservedRunningTime="2026-01-30 08:12:43.922561057 +0000 UTC m=+202.618108176" Jan 30 08:12:44 crc kubenswrapper[4870]: I0130 08:12:44.126112 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:44 crc kubenswrapper[4870]: I0130 08:12:44.126166 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:44 crc kubenswrapper[4870]: I0130 08:12:44.526345 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:44 crc kubenswrapper[4870]: I0130 08:12:44.526802 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:44 crc kubenswrapper[4870]: I0130 08:12:44.915293 4870 generic.go:334] "Generic (PLEG): container finished" podID="efc383a2-011d-40cd-95b9-5c1f97710135" containerID="0c91a5504b707e3d6da8d033947fad87369d660994ee2008baa53a00db707413" exitCode=0 Jan 30 08:12:44 crc kubenswrapper[4870]: I0130 08:12:44.916043 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"efc383a2-011d-40cd-95b9-5c1f97710135","Type":"ContainerDied","Data":"0c91a5504b707e3d6da8d033947fad87369d660994ee2008baa53a00db707413"} Jan 30 08:12:45 crc kubenswrapper[4870]: I0130 08:12:45.169686 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-85lwg" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="registry-server" probeResult="failure" output=< Jan 30 08:12:45 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:12:45 crc kubenswrapper[4870]: > Jan 30 08:12:45 crc kubenswrapper[4870]: I0130 08:12:45.580830 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nxbdr" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="registry-server" probeResult="failure" output=< Jan 30 08:12:45 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:12:45 crc kubenswrapper[4870]: > Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.248200 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.330306 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efc383a2-011d-40cd-95b9-5c1f97710135-kubelet-dir\") pod \"efc383a2-011d-40cd-95b9-5c1f97710135\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.330440 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efc383a2-011d-40cd-95b9-5c1f97710135-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "efc383a2-011d-40cd-95b9-5c1f97710135" (UID: "efc383a2-011d-40cd-95b9-5c1f97710135"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.330465 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efc383a2-011d-40cd-95b9-5c1f97710135-kube-api-access\") pod \"efc383a2-011d-40cd-95b9-5c1f97710135\" (UID: \"efc383a2-011d-40cd-95b9-5c1f97710135\") " Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.330724 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efc383a2-011d-40cd-95b9-5c1f97710135-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.337508 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc383a2-011d-40cd-95b9-5c1f97710135-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "efc383a2-011d-40cd-95b9-5c1f97710135" (UID: "efc383a2-011d-40cd-95b9-5c1f97710135"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.432209 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efc383a2-011d-40cd-95b9-5c1f97710135-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.927287 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"efc383a2-011d-40cd-95b9-5c1f97710135","Type":"ContainerDied","Data":"2c5c61c58732b27312a9b5633b48aec98bed1b8b0039f321de6adb67529a94a2"} Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.927327 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c5c61c58732b27312a9b5633b48aec98bed1b8b0039f321de6adb67529a94a2" Jan 30 08:12:46 crc kubenswrapper[4870]: I0130 08:12:46.927396 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.413555 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 08:12:49 crc kubenswrapper[4870]: E0130 08:12:49.414045 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc383a2-011d-40cd-95b9-5c1f97710135" containerName="pruner" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.414056 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc383a2-011d-40cd-95b9-5c1f97710135" containerName="pruner" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.414205 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc383a2-011d-40cd-95b9-5c1f97710135" containerName="pruner" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.414593 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.416521 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.416782 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.425829 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.470059 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-var-lock\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.470124 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.470177 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kube-api-access\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.571660 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-var-lock\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.571768 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.571841 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-var-lock\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.571857 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kube-api-access\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.571965 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.590323 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kube-api-access\") pod \"installer-9-crc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:49 crc kubenswrapper[4870]: I0130 08:12:49.739905 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:12:50 crc kubenswrapper[4870]: I0130 08:12:50.138334 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 08:12:51 crc kubenswrapper[4870]: I0130 08:12:50.999565 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc","Type":"ContainerStarted","Data":"870ae255fc8aa69089480c5b4f44f2d48029e57db6c300a41e2ada010df31423"} Jan 30 08:12:51 crc kubenswrapper[4870]: I0130 08:12:51.000098 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc","Type":"ContainerStarted","Data":"7bf7aacf1a3cb5782a5f9385d5b6312bd1fa309375e7e58df111c48bf3bdf731"} Jan 30 08:12:51 crc kubenswrapper[4870]: I0130 08:12:51.024905 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.024884951 podStartE2EDuration="2.024884951s" podCreationTimestamp="2026-01-30 08:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:51.013605359 +0000 UTC m=+209.709152478" watchObservedRunningTime="2026-01-30 08:12:51.024884951 +0000 UTC m=+209.720432060" Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.023813 4870 generic.go:334] "Generic (PLEG): container finished" podID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerID="4e94e4129ecab37de0297dde4dc86e9ac30e8fda6a11f59af65a8c199b125d87" exitCode=0 Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.023903 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-rk4lj" event={"ID":"ba2950a4-e1b9-45a9-9980-1b4169e0fb16","Type":"ContainerDied","Data":"4e94e4129ecab37de0297dde4dc86e9ac30e8fda6a11f59af65a8c199b125d87"} Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.029342 4870 generic.go:334] "Generic (PLEG): container finished" podID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerID="4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4" exitCode=0 Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.029438 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddg46" event={"ID":"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2","Type":"ContainerDied","Data":"4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4"} Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.035163 4870 generic.go:334] "Generic (PLEG): container finished" podID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerID="05623975ae7b461659818593078417aaae1d2eaf3b57481a7bebadeb7853e38b" exitCode=0 Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.035239 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlrf" event={"ID":"e02d35f8-2e8c-47a3-87c9-9580ab766290","Type":"ContainerDied","Data":"05623975ae7b461659818593078417aaae1d2eaf3b57481a7bebadeb7853e38b"} Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.039209 4870 generic.go:334] "Generic (PLEG): container finished" podID="abc41080-75c5-421f-baa8-f05792f74564" containerID="a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028" exitCode=0 Jan 30 08:12:52 crc kubenswrapper[4870]: I0130 08:12:52.040041 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm685" event={"ID":"abc41080-75c5-421f-baa8-f05792f74564","Type":"ContainerDied","Data":"a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028"} Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.048619 
4870 generic.go:334] "Generic (PLEG): container finished" podID="258d3e35-5580-4108-889c-9d5d2f80c810" containerID="31bdc406d04a8518a48f85291f438714500a3199ef4565a4e1bcc218ea393cac" exitCode=0 Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.048709 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2x5" event={"ID":"258d3e35-5580-4108-889c-9d5d2f80c810","Type":"ContainerDied","Data":"31bdc406d04a8518a48f85291f438714500a3199ef4565a4e1bcc218ea393cac"} Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.053973 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddg46" event={"ID":"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2","Type":"ContainerStarted","Data":"7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70"} Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.057189 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlrf" event={"ID":"e02d35f8-2e8c-47a3-87c9-9580ab766290","Type":"ContainerStarted","Data":"0ab86db3891506c1e61fe197330cdb4401ed14cf6adc50c8af9e3081bf1ba9b5"} Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.062988 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm685" event={"ID":"abc41080-75c5-421f-baa8-f05792f74564","Type":"ContainerStarted","Data":"fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344"} Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.070898 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4lj" event={"ID":"ba2950a4-e1b9-45a9-9980-1b4169e0fb16","Type":"ContainerStarted","Data":"9c925d71b4dfdc55925a74993dfa3447a8c069656a12a2daf0cbfcade11ab1ed"} Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.113738 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sdlrf" 
podStartSLOduration=2.441989311 podStartE2EDuration="53.1137093s" podCreationTimestamp="2026-01-30 08:12:00 +0000 UTC" firstStartedPulling="2026-01-30 08:12:02.143477482 +0000 UTC m=+160.839024591" lastFinishedPulling="2026-01-30 08:12:52.815197471 +0000 UTC m=+211.510744580" observedRunningTime="2026-01-30 08:12:53.104204913 +0000 UTC m=+211.799752022" watchObservedRunningTime="2026-01-30 08:12:53.1137093 +0000 UTC m=+211.809256409" Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.132783 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ddg46" podStartSLOduration=3.010321136 podStartE2EDuration="51.132768776s" podCreationTimestamp="2026-01-30 08:12:02 +0000 UTC" firstStartedPulling="2026-01-30 08:12:04.31424424 +0000 UTC m=+163.009791349" lastFinishedPulling="2026-01-30 08:12:52.43669187 +0000 UTC m=+211.132238989" observedRunningTime="2026-01-30 08:12:53.128916826 +0000 UTC m=+211.824463935" watchObservedRunningTime="2026-01-30 08:12:53.132768776 +0000 UTC m=+211.828315875" Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.152485 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vm685" podStartSLOduration=2.813658786 podStartE2EDuration="53.152469972s" podCreationTimestamp="2026-01-30 08:12:00 +0000 UTC" firstStartedPulling="2026-01-30 08:12:02.150362635 +0000 UTC m=+160.845909734" lastFinishedPulling="2026-01-30 08:12:52.489173811 +0000 UTC m=+211.184720920" observedRunningTime="2026-01-30 08:12:53.149442368 +0000 UTC m=+211.844989477" watchObservedRunningTime="2026-01-30 08:12:53.152469972 +0000 UTC m=+211.848017071" Jan 30 08:12:53 crc kubenswrapper[4870]: I0130 08:12:53.171721 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rk4lj" podStartSLOduration=2.830874149 podStartE2EDuration="53.171703634s" podCreationTimestamp="2026-01-30 08:12:00 
+0000 UTC" firstStartedPulling="2026-01-30 08:12:02.153527678 +0000 UTC m=+160.849074797" lastFinishedPulling="2026-01-30 08:12:52.494357163 +0000 UTC m=+211.189904282" observedRunningTime="2026-01-30 08:12:53.170348082 +0000 UTC m=+211.865895191" watchObservedRunningTime="2026-01-30 08:12:53.171703634 +0000 UTC m=+211.867250743" Jan 30 08:12:54 crc kubenswrapper[4870]: I0130 08:12:54.080583 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2x5" event={"ID":"258d3e35-5580-4108-889c-9d5d2f80c810","Type":"ContainerStarted","Data":"5d7f1da1a59f0ee841deb45c6681be28192aa5d1b0765a1de1cc229f89986ccd"} Jan 30 08:12:54 crc kubenswrapper[4870]: I0130 08:12:54.196694 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:54 crc kubenswrapper[4870]: I0130 08:12:54.229997 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cx2x5" podStartSLOduration=2.7343042 podStartE2EDuration="54.229955331s" podCreationTimestamp="2026-01-30 08:12:00 +0000 UTC" firstStartedPulling="2026-01-30 08:12:02.163299296 +0000 UTC m=+160.858846395" lastFinishedPulling="2026-01-30 08:12:53.658950417 +0000 UTC m=+212.354497526" observedRunningTime="2026-01-30 08:12:54.109847674 +0000 UTC m=+212.805394783" watchObservedRunningTime="2026-01-30 08:12:54.229955331 +0000 UTC m=+212.925502440" Jan 30 08:12:54 crc kubenswrapper[4870]: I0130 08:12:54.251101 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:12:54 crc kubenswrapper[4870]: I0130 08:12:54.587783 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:54 crc kubenswrapper[4870]: I0130 08:12:54.640154 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:55 crc kubenswrapper[4870]: I0130 08:12:55.249746 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:12:55 crc kubenswrapper[4870]: I0130 08:12:55.249818 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:12:55 crc kubenswrapper[4870]: I0130 08:12:55.249895 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:12:55 crc kubenswrapper[4870]: I0130 08:12:55.250666 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:12:55 crc kubenswrapper[4870]: I0130 08:12:55.250744 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7" gracePeriod=600 Jan 30 08:12:56 crc kubenswrapper[4870]: I0130 08:12:56.064964 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-7c6648756f-h84rs"] Jan 30 08:12:56 crc kubenswrapper[4870]: I0130 08:12:56.065575 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" podUID="2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" containerName="controller-manager" containerID="cri-o://df06edc5cedd10f1cf6063d24bbddba4264e6ef76993b8981ce5385c15ccb756" gracePeriod=30 Jan 30 08:12:56 crc kubenswrapper[4870]: I0130 08:12:56.088947 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h"] Jan 30 08:12:56 crc kubenswrapper[4870]: I0130 08:12:56.089178 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" podUID="9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" containerName="route-controller-manager" containerID="cri-o://c5fa843ed751475d23b4cc7c99550ca1b426459de45a94b59e3341fcd829105d" gracePeriod=30 Jan 30 08:12:56 crc kubenswrapper[4870]: I0130 08:12:56.107283 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7" exitCode=0 Jan 30 08:12:56 crc kubenswrapper[4870]: I0130 08:12:56.107342 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7"} Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.115256 4870 generic.go:334] "Generic (PLEG): container finished" podID="2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" containerID="df06edc5cedd10f1cf6063d24bbddba4264e6ef76993b8981ce5385c15ccb756" exitCode=0 Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.115368 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" event={"ID":"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05","Type":"ContainerDied","Data":"df06edc5cedd10f1cf6063d24bbddba4264e6ef76993b8981ce5385c15ccb756"} Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.118815 4870 generic.go:334] "Generic (PLEG): container finished" podID="9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" containerID="c5fa843ed751475d23b4cc7c99550ca1b426459de45a94b59e3341fcd829105d" exitCode=0 Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.118905 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" event={"ID":"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef","Type":"ContainerDied","Data":"c5fa843ed751475d23b4cc7c99550ca1b426459de45a94b59e3341fcd829105d"} Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.123815 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"a02ea1737ed88e21ae8883a3a6a22392b0695152f4ace29771521a0445381b12"} Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.415968 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.461749 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"] Jan 30 08:12:57 crc kubenswrapper[4870]: E0130 08:12:57.462015 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" containerName="route-controller-manager" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.462036 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" containerName="route-controller-manager" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.462185 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" containerName="route-controller-manager" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.463995 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.475378 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"] Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.489826 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-serving-cert\") pod \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.489867 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-config\") pod \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.489956 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4pz\" (UniqueName: \"kubernetes.io/projected/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-kube-api-access-jh4pz\") pod \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.490031 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-client-ca\") pod \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\" (UID: \"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.490781 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-client-ca" (OuterVolumeSpecName: "client-ca") pod "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" 
(UID: "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.490803 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-config" (OuterVolumeSpecName: "config") pod "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" (UID: "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.496795 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" (UID: "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.496842 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-kube-api-access-jh4pz" (OuterVolumeSpecName: "kube-api-access-jh4pz") pod "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" (UID: "9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef"). InnerVolumeSpecName "kube-api-access-jh4pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.498646 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.591525 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncp7p\" (UniqueName: \"kubernetes.io/projected/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-kube-api-access-ncp7p\") pod \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.591689 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-serving-cert\") pod \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592156 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-proxy-ca-bundles\") pod \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592204 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-config\") pod \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592247 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-client-ca\") pod \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\" (UID: \"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05\") " Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592419 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lrvtn\" (UniqueName: \"kubernetes.io/projected/2401867d-7869-4633-aeeb-bfb3653c2786-kube-api-access-lrvtn\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592456 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2401867d-7869-4633-aeeb-bfb3653c2786-serving-cert\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592518 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-config\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592545 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-client-ca\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592828 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4pz\" (UniqueName: \"kubernetes.io/projected/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-kube-api-access-jh4pz\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: 
I0130 08:12:57.592860 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592923 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.592938 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.593069 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" (UID: "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.593124 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-client-ca" (OuterVolumeSpecName: "client-ca") pod "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" (UID: "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.593364 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-config" (OuterVolumeSpecName: "config") pod "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" (UID: "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.594927 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" (UID: "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.595154 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-kube-api-access-ncp7p" (OuterVolumeSpecName: "kube-api-access-ncp7p") pod "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" (UID: "2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05"). InnerVolumeSpecName "kube-api-access-ncp7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694050 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvtn\" (UniqueName: \"kubernetes.io/projected/2401867d-7869-4633-aeeb-bfb3653c2786-kube-api-access-lrvtn\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694128 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2401867d-7869-4633-aeeb-bfb3653c2786-serving-cert\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694219 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-config\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694275 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-client-ca\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694417 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694448 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694480 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694506 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.694531 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncp7p\" (UniqueName: \"kubernetes.io/projected/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05-kube-api-access-ncp7p\") on node 
\"crc\" DevicePath \"\"" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.695443 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-config\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.695975 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-client-ca\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.700863 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2401867d-7869-4633-aeeb-bfb3653c2786-serving-cert\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.712938 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvtn\" (UniqueName: \"kubernetes.io/projected/2401867d-7869-4633-aeeb-bfb3653c2786-kube-api-access-lrvtn\") pod \"route-controller-manager-658c669f4d-v69zt\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") " pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:57 crc kubenswrapper[4870]: I0130 08:12:57.833094 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.138485 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" event={"ID":"9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef","Type":"ContainerDied","Data":"82eb7cd80398c1cdbf2869d9e797591e5a6a4357ecdd3aada431258c621248cb"} Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.138884 4870 scope.go:117] "RemoveContainer" containerID="c5fa843ed751475d23b4cc7c99550ca1b426459de45a94b59e3341fcd829105d" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.138657 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.142159 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" event={"ID":"2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05","Type":"ContainerDied","Data":"29687b205e4bfe58a29a1a38844f13e1a74004c67624566a0e52141eca348418"} Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.142189 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c6648756f-h84rs" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.167041 4870 scope.go:117] "RemoveContainer" containerID="df06edc5cedd10f1cf6063d24bbddba4264e6ef76993b8981ce5385c15ccb756" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.171101 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h"] Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.181831 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f547cf58-vv68h"] Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.190654 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c6648756f-h84rs"] Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.193699 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c6648756f-h84rs"] Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.242228 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"] Jan 30 08:12:58 crc kubenswrapper[4870]: W0130 08:12:58.250060 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2401867d_7869_4633_aeeb_bfb3653c2786.slice/crio-a4cce0bf268510416f33b5ed9e80efbf966fec2a36ba47966d7bf118bdd855e6 WatchSource:0}: Error finding container a4cce0bf268510416f33b5ed9e80efbf966fec2a36ba47966d7bf118bdd855e6: Status 404 returned error can't find the container with id a4cce0bf268510416f33b5ed9e80efbf966fec2a36ba47966d7bf118bdd855e6 Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.312683 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nxbdr"] Jan 30 08:12:58 crc 
kubenswrapper[4870]: I0130 08:12:58.312927 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nxbdr" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="registry-server" containerID="cri-o://c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69" gracePeriod=2 Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.713347 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.809740 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-utilities\") pod \"f5255b75-6d10-40f0-9d11-c975458382cb\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.809902 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmxfx\" (UniqueName: \"kubernetes.io/projected/f5255b75-6d10-40f0-9d11-c975458382cb-kube-api-access-hmxfx\") pod \"f5255b75-6d10-40f0-9d11-c975458382cb\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.809975 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-catalog-content\") pod \"f5255b75-6d10-40f0-9d11-c975458382cb\" (UID: \"f5255b75-6d10-40f0-9d11-c975458382cb\") " Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.811665 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-utilities" (OuterVolumeSpecName: "utilities") pod "f5255b75-6d10-40f0-9d11-c975458382cb" (UID: "f5255b75-6d10-40f0-9d11-c975458382cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.817976 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5255b75-6d10-40f0-9d11-c975458382cb-kube-api-access-hmxfx" (OuterVolumeSpecName: "kube-api-access-hmxfx") pod "f5255b75-6d10-40f0-9d11-c975458382cb" (UID: "f5255b75-6d10-40f0-9d11-c975458382cb"). InnerVolumeSpecName "kube-api-access-hmxfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.911731 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmxfx\" (UniqueName: \"kubernetes.io/projected/f5255b75-6d10-40f0-9d11-c975458382cb-kube-api-access-hmxfx\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.911774 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:58 crc kubenswrapper[4870]: I0130 08:12:58.956269 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5255b75-6d10-40f0-9d11-c975458382cb" (UID: "f5255b75-6d10-40f0-9d11-c975458382cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.013727 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5255b75-6d10-40f0-9d11-c975458382cb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.150383 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" event={"ID":"2401867d-7869-4633-aeeb-bfb3653c2786","Type":"ContainerStarted","Data":"761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9"} Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.150429 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" event={"ID":"2401867d-7869-4633-aeeb-bfb3653c2786","Type":"ContainerStarted","Data":"a4cce0bf268510416f33b5ed9e80efbf966fec2a36ba47966d7bf118bdd855e6"} Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.150698 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.158741 4870 generic.go:334] "Generic (PLEG): container finished" podID="f5255b75-6d10-40f0-9d11-c975458382cb" containerID="c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69" exitCode=0 Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.158776 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxbdr" event={"ID":"f5255b75-6d10-40f0-9d11-c975458382cb","Type":"ContainerDied","Data":"c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69"} Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.158795 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxbdr" 
event={"ID":"f5255b75-6d10-40f0-9d11-c975458382cb","Type":"ContainerDied","Data":"75dc4d4ca08c96b6af316cef86b49419d2a6ad7374d685b482b8ff2fed0aeb65"} Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.158853 4870 scope.go:117] "RemoveContainer" containerID="c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.158973 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nxbdr" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.161541 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.209661 4870 scope.go:117] "RemoveContainer" containerID="72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.219668 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" podStartSLOduration=3.2196494429999998 podStartE2EDuration="3.219649443s" podCreationTimestamp="2026-01-30 08:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:12:59.215593416 +0000 UTC m=+217.911140535" watchObservedRunningTime="2026-01-30 08:12:59.219649443 +0000 UTC m=+217.915196572" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.263693 4870 scope.go:117] "RemoveContainer" containerID="91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.282005 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nxbdr"] Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.300846 4870 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-nxbdr"] Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.310218 4870 scope.go:117] "RemoveContainer" containerID="c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69" Jan 30 08:12:59 crc kubenswrapper[4870]: E0130 08:12:59.310826 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69\": container with ID starting with c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69 not found: ID does not exist" containerID="c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.310924 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69"} err="failed to get container status \"c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69\": rpc error: code = NotFound desc = could not find container \"c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69\": container with ID starting with c8d326de4781de23a826a5d5681c703c144d7d526b6723de04af2ff52d6c5c69 not found: ID does not exist" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.310958 4870 scope.go:117] "RemoveContainer" containerID="72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36" Jan 30 08:12:59 crc kubenswrapper[4870]: E0130 08:12:59.311397 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36\": container with ID starting with 72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36 not found: ID does not exist" containerID="72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 
08:12:59.311455 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36"} err="failed to get container status \"72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36\": rpc error: code = NotFound desc = could not find container \"72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36\": container with ID starting with 72d65fd678209cea5f65f67e6fba57d68354274d2c312d3a2b34858c8f8e5d36 not found: ID does not exist" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.311491 4870 scope.go:117] "RemoveContainer" containerID="91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381" Jan 30 08:12:59 crc kubenswrapper[4870]: E0130 08:12:59.311800 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381\": container with ID starting with 91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381 not found: ID does not exist" containerID="91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381" Jan 30 08:12:59 crc kubenswrapper[4870]: I0130 08:12:59.311840 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381"} err="failed to get container status \"91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381\": rpc error: code = NotFound desc = could not find container \"91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381\": container with ID starting with 91c2f94ca5ba24911752768fbf8405337095dc7b2ce3321b5d0889c6d9e3f381 not found: ID does not exist" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.083329 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" 
path="/var/lib/kubelet/pods/2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05/volumes" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.084452 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef" path="/var/lib/kubelet/pods/9e4a05ab-16ef-4d68-9e7e-a9064ce0a4ef/volumes" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.085158 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" path="/var/lib/kubelet/pods/f5255b75-6d10-40f0-9d11-c975458382cb/volumes" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.190667 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"] Jan 30 08:13:00 crc kubenswrapper[4870]: E0130 08:13:00.191014 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="extract-content" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.191034 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="extract-content" Jan 30 08:13:00 crc kubenswrapper[4870]: E0130 08:13:00.191048 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="registry-server" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.191056 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="registry-server" Jan 30 08:13:00 crc kubenswrapper[4870]: E0130 08:13:00.191074 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="extract-utilities" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.191108 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="extract-utilities" Jan 30 08:13:00 crc kubenswrapper[4870]: 
E0130 08:13:00.191123 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" containerName="controller-manager" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.191131 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" containerName="controller-manager" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.191255 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb2cfd8-50f7-49c7-a7e8-45e7675e9f05" containerName="controller-manager" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.191270 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5255b75-6d10-40f0-9d11-c975458382cb" containerName="registry-server" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.191749 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.193960 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.195531 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.196079 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.197076 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.197154 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.197191 4870 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.201366 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"] Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.206423 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.234526 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b32bd5-0420-437c-abe3-b568b5fced25-serving-cert\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.234598 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-config\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.234675 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2llk\" (UniqueName: \"kubernetes.io/projected/f0b32bd5-0420-437c-abe3-b568b5fced25-kube-api-access-n2llk\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.234741 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-proxy-ca-bundles\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.234816 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-client-ca\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.336137 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b32bd5-0420-437c-abe3-b568b5fced25-serving-cert\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.336216 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-config\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.336273 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2llk\" (UniqueName: \"kubernetes.io/projected/f0b32bd5-0420-437c-abe3-b568b5fced25-kube-api-access-n2llk\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.336314 
4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-proxy-ca-bundles\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.336373 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-client-ca\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.338041 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-client-ca\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.339205 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-proxy-ca-bundles\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.339686 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-config\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 
30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.342083 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b32bd5-0420-437c-abe3-b568b5fced25-serving-cert\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.358110 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2llk\" (UniqueName: \"kubernetes.io/projected/f0b32bd5-0420-437c-abe3-b568b5fced25-kube-api-access-n2llk\") pod \"controller-manager-5c6cdccc5f-szscj\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") " pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.523484 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.686415 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.686761 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:13:00 crc kubenswrapper[4870]: I0130 08:13:00.754491 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.015095 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"] Jan 30 08:13:01 crc kubenswrapper[4870]: W0130 08:13:01.027393 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0b32bd5_0420_437c_abe3_b568b5fced25.slice/crio-9ddf8fa07886379822b5d7e837b006d9d1c35dacedfc83c71d1a7c67e61b89cf WatchSource:0}: Error finding container 9ddf8fa07886379822b5d7e837b006d9d1c35dacedfc83c71d1a7c67e61b89cf: Status 404 returned error can't find the container with id 9ddf8fa07886379822b5d7e837b006d9d1c35dacedfc83c71d1a7c67e61b89cf Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.054331 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.054385 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.101663 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.143293 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.143340 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.181851 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" event={"ID":"f0b32bd5-0420-437c-abe3-b568b5fced25","Type":"ContainerStarted","Data":"9ddf8fa07886379822b5d7e837b006d9d1c35dacedfc83c71d1a7c67e61b89cf"} Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.204032 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.237986 4870 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sdlrf" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.241322 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.253921 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.285976 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vm685" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.286037 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vm685" Jan 30 08:13:01 crc kubenswrapper[4870]: I0130 08:13:01.355255 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vm685" Jan 30 08:13:02 crc kubenswrapper[4870]: I0130 08:13:02.189763 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" event={"ID":"f0b32bd5-0420-437c-abe3-b568b5fced25","Type":"ContainerStarted","Data":"e0c909e10ad146c5f25ddc4c306ee746d3ae6daa59a6ed6f36b3bed06976cc14"} Jan 30 08:13:02 crc kubenswrapper[4870]: I0130 08:13:02.213415 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" podStartSLOduration=6.213399202 podStartE2EDuration="6.213399202s" podCreationTimestamp="2026-01-30 08:12:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:13:02.211371748 +0000 UTC m=+220.906918857" watchObservedRunningTime="2026-01-30 08:13:02.213399202 +0000 UTC m=+220.908946311" Jan 30 08:13:02 crc 
kubenswrapper[4870]: I0130 08:13:02.248264 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vm685"
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.048570 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ddg46"
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.048629 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ddg46"
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.093580 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ddg46"
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.131446 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sdlrf"]
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.198205 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.199012 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sdlrf" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="registry-server" containerID="cri-o://0ab86db3891506c1e61fe197330cdb4401ed14cf6adc50c8af9e3081bf1ba9b5" gracePeriod=2
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.208745 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"
Jan 30 08:13:03 crc kubenswrapper[4870]: I0130 08:13:03.270809 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ddg46"
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.113385 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vm685"]
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.217849 4870 generic.go:334] "Generic (PLEG): container finished" podID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerID="0ab86db3891506c1e61fe197330cdb4401ed14cf6adc50c8af9e3081bf1ba9b5" exitCode=0
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.219403 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlrf" event={"ID":"e02d35f8-2e8c-47a3-87c9-9580ab766290","Type":"ContainerDied","Data":"0ab86db3891506c1e61fe197330cdb4401ed14cf6adc50c8af9e3081bf1ba9b5"}
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.219598 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vm685" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="registry-server" containerID="cri-o://fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344" gracePeriod=2
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.361676 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sdlrf"
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.402535 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-catalog-content\") pod \"e02d35f8-2e8c-47a3-87c9-9580ab766290\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") "
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.402590 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-utilities\") pod \"e02d35f8-2e8c-47a3-87c9-9580ab766290\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") "
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.402614 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wlfz\" (UniqueName: \"kubernetes.io/projected/e02d35f8-2e8c-47a3-87c9-9580ab766290-kube-api-access-6wlfz\") pod \"e02d35f8-2e8c-47a3-87c9-9580ab766290\" (UID: \"e02d35f8-2e8c-47a3-87c9-9580ab766290\") "
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.404303 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-utilities" (OuterVolumeSpecName: "utilities") pod "e02d35f8-2e8c-47a3-87c9-9580ab766290" (UID: "e02d35f8-2e8c-47a3-87c9-9580ab766290"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.426674 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02d35f8-2e8c-47a3-87c9-9580ab766290-kube-api-access-6wlfz" (OuterVolumeSpecName: "kube-api-access-6wlfz") pod "e02d35f8-2e8c-47a3-87c9-9580ab766290" (UID: "e02d35f8-2e8c-47a3-87c9-9580ab766290"). InnerVolumeSpecName "kube-api-access-6wlfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.465397 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e02d35f8-2e8c-47a3-87c9-9580ab766290" (UID: "e02d35f8-2e8c-47a3-87c9-9580ab766290"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.506975 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.507041 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wlfz\" (UniqueName: \"kubernetes.io/projected/e02d35f8-2e8c-47a3-87c9-9580ab766290-kube-api-access-6wlfz\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.507145 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e02d35f8-2e8c-47a3-87c9-9580ab766290-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.673312 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm685"
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.709301 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-catalog-content\") pod \"abc41080-75c5-421f-baa8-f05792f74564\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") "
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.709400 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhq7x\" (UniqueName: \"kubernetes.io/projected/abc41080-75c5-421f-baa8-f05792f74564-kube-api-access-lhq7x\") pod \"abc41080-75c5-421f-baa8-f05792f74564\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") "
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.709502 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-utilities\") pod \"abc41080-75c5-421f-baa8-f05792f74564\" (UID: \"abc41080-75c5-421f-baa8-f05792f74564\") "
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.710415 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-utilities" (OuterVolumeSpecName: "utilities") pod "abc41080-75c5-421f-baa8-f05792f74564" (UID: "abc41080-75c5-421f-baa8-f05792f74564"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.712887 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc41080-75c5-421f-baa8-f05792f74564-kube-api-access-lhq7x" (OuterVolumeSpecName: "kube-api-access-lhq7x") pod "abc41080-75c5-421f-baa8-f05792f74564" (UID: "abc41080-75c5-421f-baa8-f05792f74564"). InnerVolumeSpecName "kube-api-access-lhq7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.767124 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abc41080-75c5-421f-baa8-f05792f74564" (UID: "abc41080-75c5-421f-baa8-f05792f74564"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.812248 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.812313 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abc41080-75c5-421f-baa8-f05792f74564-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:04 crc kubenswrapper[4870]: I0130 08:13:04.812341 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhq7x\" (UniqueName: \"kubernetes.io/projected/abc41080-75c5-421f-baa8-f05792f74564-kube-api-access-lhq7x\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.225558 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlrf" event={"ID":"e02d35f8-2e8c-47a3-87c9-9580ab766290","Type":"ContainerDied","Data":"17d6a9bdca6c16fe2977a640455c60bcc06dd2ad4ecdc2b9c6411506d215c0be"}
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.226032 4870 scope.go:117] "RemoveContainer" containerID="0ab86db3891506c1e61fe197330cdb4401ed14cf6adc50c8af9e3081bf1ba9b5"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.225634 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sdlrf"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.229947 4870 generic.go:334] "Generic (PLEG): container finished" podID="abc41080-75c5-421f-baa8-f05792f74564" containerID="fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344" exitCode=0
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.230042 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm685"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.230039 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm685" event={"ID":"abc41080-75c5-421f-baa8-f05792f74564","Type":"ContainerDied","Data":"fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344"}
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.230096 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm685" event={"ID":"abc41080-75c5-421f-baa8-f05792f74564","Type":"ContainerDied","Data":"96518355e0bd9b243a322652ed93adea62f75e712bc08772e1f193f3dde1d1a9"}
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.254268 4870 scope.go:117] "RemoveContainer" containerID="05623975ae7b461659818593078417aaae1d2eaf3b57481a7bebadeb7853e38b"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.284349 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sdlrf"]
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.289744 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sdlrf"]
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.299450 4870 scope.go:117] "RemoveContainer" containerID="1ecf5db22e2b1fa8547549ce582ecddde377bbe670b1f97e03e9e9e6f42d4dae"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.325955 4870 scope.go:117] "RemoveContainer" containerID="fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.330224 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vm685"]
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.331852 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vm685"]
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.353546 4870 scope.go:117] "RemoveContainer" containerID="a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.368374 4870 scope.go:117] "RemoveContainer" containerID="c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.385620 4870 scope.go:117] "RemoveContainer" containerID="fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344"
Jan 30 08:13:05 crc kubenswrapper[4870]: E0130 08:13:05.386271 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344\": container with ID starting with fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344 not found: ID does not exist" containerID="fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.386303 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344"} err="failed to get container status \"fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344\": rpc error: code = NotFound desc = could not find container \"fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344\": container with ID starting with fd4c89b8cbd81ba1ad1c068e8582d00bddadd898a4dcfa9e47b3ae08c6dd6344 not found: ID does not exist"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.386324 4870 scope.go:117] "RemoveContainer" containerID="a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028"
Jan 30 08:13:05 crc kubenswrapper[4870]: E0130 08:13:05.386886 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028\": container with ID starting with a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028 not found: ID does not exist" containerID="a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.386927 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028"} err="failed to get container status \"a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028\": rpc error: code = NotFound desc = could not find container \"a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028\": container with ID starting with a50ef1faa7c24f23fbc57c47c06d93445c8919db7d381cb38dc62d4428844028 not found: ID does not exist"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.386958 4870 scope.go:117] "RemoveContainer" containerID="c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354"
Jan 30 08:13:05 crc kubenswrapper[4870]: E0130 08:13:05.387499 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354\": container with ID starting with c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354 not found: ID does not exist" containerID="c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.387528 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354"} err="failed to get container status \"c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354\": rpc error: code = NotFound desc = could not find container \"c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354\": container with ID starting with c5a44e3995c68d711cf7ffd635d1bf773b8cff7c52ede4919575b0156f885354 not found: ID does not exist"
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.513945 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddg46"]
Jan 30 08:13:05 crc kubenswrapper[4870]: I0130 08:13:05.514195 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ddg46" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="registry-server" containerID="cri-o://7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70" gracePeriod=2
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.005114 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddg46"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.085796 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc41080-75c5-421f-baa8-f05792f74564" path="/var/lib/kubelet/pods/abc41080-75c5-421f-baa8-f05792f74564/volumes"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.087316 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" path="/var/lib/kubelet/pods/e02d35f8-2e8c-47a3-87c9-9580ab766290/volumes"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.137295 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sz97\" (UniqueName: \"kubernetes.io/projected/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-kube-api-access-4sz97\") pod \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") "
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.137841 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-catalog-content\") pod \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") "
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.137974 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-utilities\") pod \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\" (UID: \"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2\") "
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.138948 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-utilities" (OuterVolumeSpecName: "utilities") pod "025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" (UID: "025ee8c8-8a97-4158-88fb-c4fa23f5c9c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.144604 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-kube-api-access-4sz97" (OuterVolumeSpecName: "kube-api-access-4sz97") pod "025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" (UID: "025ee8c8-8a97-4158-88fb-c4fa23f5c9c2"). InnerVolumeSpecName "kube-api-access-4sz97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.168545 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" (UID: "025ee8c8-8a97-4158-88fb-c4fa23f5c9c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.240479 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.240549 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sz97\" (UniqueName: \"kubernetes.io/projected/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-kube-api-access-4sz97\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.240574 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.243568 4870 generic.go:334] "Generic (PLEG): container finished" podID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerID="7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70" exitCode=0
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.243642 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddg46" event={"ID":"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2","Type":"ContainerDied","Data":"7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70"}
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.243680 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddg46" event={"ID":"025ee8c8-8a97-4158-88fb-c4fa23f5c9c2","Type":"ContainerDied","Data":"a0bdc36a8576d5c25a0097622d42f72393c74577381da880313d27ca87e33cc7"}
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.243700 4870 scope.go:117] "RemoveContainer" containerID="7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.243955 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddg46"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.277583 4870 scope.go:117] "RemoveContainer" containerID="4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.285855 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddg46"]
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.293865 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddg46"]
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.307838 4870 scope.go:117] "RemoveContainer" containerID="3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.329439 4870 scope.go:117] "RemoveContainer" containerID="7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70"
Jan 30 08:13:06 crc kubenswrapper[4870]: E0130 08:13:06.329945 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70\": container with ID starting with 7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70 not found: ID does not exist" containerID="7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.329989 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70"} err="failed to get container status \"7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70\": rpc error: code = NotFound desc = could not find container \"7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70\": container with ID starting with 7b76f09aa9f9994ca61e250ea9fcc87f7cc8e5437b6738688da0a44244a55f70 not found: ID does not exist"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.330019 4870 scope.go:117] "RemoveContainer" containerID="4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4"
Jan 30 08:13:06 crc kubenswrapper[4870]: E0130 08:13:06.330548 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4\": container with ID starting with 4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4 not found: ID does not exist" containerID="4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.330622 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4"} err="failed to get container status \"4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4\": rpc error: code = NotFound desc = could not find container \"4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4\": container with ID starting with 4de26b3246c8808449e770399c4f7fa4183c010b1afab00b9d0e527740a08bc4 not found: ID does not exist"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.330673 4870 scope.go:117] "RemoveContainer" containerID="3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c"
Jan 30 08:13:06 crc kubenswrapper[4870]: E0130 08:13:06.331055 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c\": container with ID starting with 3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c not found: ID does not exist" containerID="3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.331087 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c"} err="failed to get container status \"3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c\": rpc error: code = NotFound desc = could not find container \"3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c\": container with ID starting with 3038fe3301b881e13deef165b3387577b2b42b2fdbf788c35bdc741aa4ed718c not found: ID does not exist"
Jan 30 08:13:06 crc kubenswrapper[4870]: I0130 08:13:06.652705 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" podUID="d4876c72-6cd1-43e0-b44a-45c4bd69e91f" containerName="oauth-openshift" containerID="cri-o://102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789" gracePeriod=15
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.197687 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx"
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.256828 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-service-ca\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.256900 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-session\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.256926 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-login\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.256965 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-error\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.256988 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-idp-0-file-data\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.257020 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-dir\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.257267 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258221 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-provider-selection\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258274 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-router-certs\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258323 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dp8v\" (UniqueName: \"kubernetes.io/projected/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-kube-api-access-4dp8v\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258362 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-ocp-branding-template\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258393 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-cliconfig\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258622 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-trusted-ca-bundle\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258646 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-serving-cert\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258641 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258677 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-policies\") pod \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\" (UID: \"d4876c72-6cd1-43e0-b44a-45c4bd69e91f\") "
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258905 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.258919 4870 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-dir\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.259365 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.259834 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.260465 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.262696 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.263105 4870 generic.go:334] "Generic (PLEG): container finished" podID="d4876c72-6cd1-43e0-b44a-45c4bd69e91f" containerID="102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789" exitCode=0
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.263241 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" event={"ID":"d4876c72-6cd1-43e0-b44a-45c4bd69e91f","Type":"ContainerDied","Data":"102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789"}
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.263282 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx" event={"ID":"d4876c72-6cd1-43e0-b44a-45c4bd69e91f","Type":"ContainerDied","Data":"94d52f9687de877d5fd97b94963947e16acbe6d1f11849a8cb9317ae4e717ce7"}
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.263303 4870 scope.go:117] "RemoveContainer" containerID="102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789"
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.263425 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xxrkx"
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.265175 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.267157 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.270068 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-system-session".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.270343 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.271342 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.272041 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.272449 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-kube-api-access-4dp8v" (OuterVolumeSpecName: "kube-api-access-4dp8v") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "kube-api-access-4dp8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.274469 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d4876c72-6cd1-43e0-b44a-45c4bd69e91f" (UID: "d4876c72-6cd1-43e0-b44a-45c4bd69e91f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.302159 4870 scope.go:117] "RemoveContainer" containerID="102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789" Jan 30 08:13:07 crc kubenswrapper[4870]: E0130 08:13:07.302745 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789\": container with ID starting with 102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789 not found: ID does not exist" containerID="102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.302788 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789"} err="failed to get container status \"102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789\": rpc error: code = NotFound desc = could not find container \"102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789\": container with ID starting with 102b04420a2d27e2c42980d96dcb76309f8b40b50075cb8c01ca29cf54abe789 not found: ID does not exist" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360508 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360541 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360558 4870 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360571 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360582 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360590 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360600 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360611 4870 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360623 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360635 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dp8v\" (UniqueName: \"kubernetes.io/projected/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-kube-api-access-4dp8v\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360650 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.360663 4870 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4876c72-6cd1-43e0-b44a-45c4bd69e91f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.605276 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xxrkx"] Jan 30 08:13:07 crc kubenswrapper[4870]: I0130 08:13:07.609029 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xxrkx"] Jan 30 08:13:08 crc kubenswrapper[4870]: I0130 08:13:08.082853 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" 
path="/var/lib/kubelet/pods/025ee8c8-8a97-4158-88fb-c4fa23f5c9c2/volumes" Jan 30 08:13:08 crc kubenswrapper[4870]: I0130 08:13:08.084262 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4876c72-6cd1-43e0-b44a-45c4bd69e91f" path="/var/lib/kubelet/pods/d4876c72-6cd1-43e0-b44a-45c4bd69e91f/volumes" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.206579 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"] Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.206924 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="extract-utilities" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.206944 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="extract-utilities" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.206961 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.206971 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.206987 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="extract-content" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.206998 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="extract-content" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.207012 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4876c72-6cd1-43e0-b44a-45c4bd69e91f" containerName="oauth-openshift" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207023 4870 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d4876c72-6cd1-43e0-b44a-45c4bd69e91f" containerName="oauth-openshift" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.207043 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207053 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.207071 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="extract-utilities" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207080 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="extract-utilities" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.207089 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207097 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.207115 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="extract-content" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207123 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="extract-content" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.207154 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="extract-utilities" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207162 4870 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="extract-utilities" Jan 30 08:13:10 crc kubenswrapper[4870]: E0130 08:13:10.207172 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="extract-content" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207181 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="extract-content" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207316 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4876c72-6cd1-43e0-b44a-45c4bd69e91f" containerName="oauth-openshift" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207331 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="025ee8c8-8a97-4158-88fb-c4fa23f5c9c2" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207341 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02d35f8-2e8c-47a3-87c9-9580ab766290" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207353 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc41080-75c5-421f-baa8-f05792f74564" containerName="registry-server" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.207913 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.214639 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.214701 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.215091 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.217459 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.217643 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.218544 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.218934 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.219107 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.219198 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.220089 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 08:13:10 crc 
kubenswrapper[4870]: I0130 08:13:10.220610 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.222124 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.230254 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.234561 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"] Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.236916 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.241603 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.304472 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-session\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.304560 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " 
pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.304601 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.304646 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.304679 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.304717 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 
08:13:10.304749 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-service-ca\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.304963 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-router-certs\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.305067 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnz92\" (UniqueName: \"kubernetes.io/projected/5663080a-bd5b-4cfd-84be-13421571ce8a-kube-api-access-xnz92\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.305110 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5663080a-bd5b-4cfd-84be-13421571ce8a-audit-dir\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.305190 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.305231 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-error\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.305475 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-audit-policies\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.305540 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-login\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.407653 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") 
" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.407765 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.407822 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.407867 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-service-ca\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.407956 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-router-certs\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408049 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xnz92\" (UniqueName: \"kubernetes.io/projected/5663080a-bd5b-4cfd-84be-13421571ce8a-kube-api-access-xnz92\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408087 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5663080a-bd5b-4cfd-84be-13421571ce8a-audit-dir\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408125 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408160 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-error\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408224 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-audit-policies\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408261 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-login\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408299 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-session\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408351 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.408391 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.410243 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.411194 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-audit-policies\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.410312 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-service-ca\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.412011 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.411135 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5663080a-bd5b-4cfd-84be-13421571ce8a-audit-dir\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.414665 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.414689 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-error\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.414702 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.416521 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.416751 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-session\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.417518 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.417926 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-user-template-login\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.418214 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5663080a-bd5b-4cfd-84be-13421571ce8a-v4-0-config-system-router-certs\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.435818 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnz92\" (UniqueName: \"kubernetes.io/projected/5663080a-bd5b-4cfd-84be-13421571ce8a-kube-api-access-xnz92\") pod \"oauth-openshift-dc8679f5f-mdxn5\" (UID: \"5663080a-bd5b-4cfd-84be-13421571ce8a\") " pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:10 crc kubenswrapper[4870]: I0130 08:13:10.553491 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:11 crc kubenswrapper[4870]: I0130 08:13:11.009691 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"]
Jan 30 08:13:11 crc kubenswrapper[4870]: W0130 08:13:11.025774 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5663080a_bd5b_4cfd_84be_13421571ce8a.slice/crio-2b6e8c53aef34696d37ddc955b0f109b7ba6d75d043ea54fa91476cc734a3687 WatchSource:0}: Error finding container 2b6e8c53aef34696d37ddc955b0f109b7ba6d75d043ea54fa91476cc734a3687: Status 404 returned error can't find the container with id 2b6e8c53aef34696d37ddc955b0f109b7ba6d75d043ea54fa91476cc734a3687
Jan 30 08:13:11 crc kubenswrapper[4870]: I0130 08:13:11.299406 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" event={"ID":"5663080a-bd5b-4cfd-84be-13421571ce8a","Type":"ContainerStarted","Data":"2b6e8c53aef34696d37ddc955b0f109b7ba6d75d043ea54fa91476cc734a3687"}
Jan 30 08:13:12 crc kubenswrapper[4870]: I0130 08:13:12.307356 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" event={"ID":"5663080a-bd5b-4cfd-84be-13421571ce8a","Type":"ContainerStarted","Data":"0cf2ca9535702238062d0f3eec5b2232879212588fd08767b46dd1c00eefa89b"}
Jan 30 08:13:12 crc kubenswrapper[4870]: I0130 08:13:12.307814 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:12 crc kubenswrapper[4870]: I0130 08:13:12.313357 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5"
Jan 30 08:13:12 crc kubenswrapper[4870]: I0130 08:13:12.335209 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-dc8679f5f-mdxn5" podStartSLOduration=31.33519352 podStartE2EDuration="31.33519352s" podCreationTimestamp="2026-01-30 08:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:13:12.331870076 +0000 UTC m=+231.027417195" watchObservedRunningTime="2026-01-30 08:13:12.33519352 +0000 UTC m=+231.030740629"
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.119667 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"]
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.120929 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" podUID="f0b32bd5-0420-437c-abe3-b568b5fced25" containerName="controller-manager" containerID="cri-o://e0c909e10ad146c5f25ddc4c306ee746d3ae6daa59a6ed6f36b3bed06976cc14" gracePeriod=30
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.216896 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"]
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.217218 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" podUID="2401867d-7869-4633-aeeb-bfb3653c2786" containerName="route-controller-manager" containerID="cri-o://761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9" gracePeriod=30
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.334431 4870 generic.go:334] "Generic (PLEG): container finished" podID="f0b32bd5-0420-437c-abe3-b568b5fced25" containerID="e0c909e10ad146c5f25ddc4c306ee746d3ae6daa59a6ed6f36b3bed06976cc14" exitCode=0
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.334496 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" event={"ID":"f0b32bd5-0420-437c-abe3-b568b5fced25","Type":"ContainerDied","Data":"e0c909e10ad146c5f25ddc4c306ee746d3ae6daa59a6ed6f36b3bed06976cc14"}
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.701418 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.789372 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.821830 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-client-ca" (OuterVolumeSpecName: "client-ca") pod "2401867d-7869-4633-aeeb-bfb3653c2786" (UID: "2401867d-7869-4633-aeeb-bfb3653c2786"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.820588 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-client-ca\") pod \"2401867d-7869-4633-aeeb-bfb3653c2786\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.822086 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2401867d-7869-4633-aeeb-bfb3653c2786-serving-cert\") pod \"2401867d-7869-4633-aeeb-bfb3653c2786\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.823003 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvtn\" (UniqueName: \"kubernetes.io/projected/2401867d-7869-4633-aeeb-bfb3653c2786-kube-api-access-lrvtn\") pod \"2401867d-7869-4633-aeeb-bfb3653c2786\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.823431 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-config\") pod \"2401867d-7869-4633-aeeb-bfb3653c2786\" (UID: \"2401867d-7869-4633-aeeb-bfb3653c2786\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.823835 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.826584 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-config" (OuterVolumeSpecName: "config") pod "2401867d-7869-4633-aeeb-bfb3653c2786" (UID: "2401867d-7869-4633-aeeb-bfb3653c2786"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.830732 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2401867d-7869-4633-aeeb-bfb3653c2786-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2401867d-7869-4633-aeeb-bfb3653c2786" (UID: "2401867d-7869-4633-aeeb-bfb3653c2786"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.831680 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2401867d-7869-4633-aeeb-bfb3653c2786-kube-api-access-lrvtn" (OuterVolumeSpecName: "kube-api-access-lrvtn") pod "2401867d-7869-4633-aeeb-bfb3653c2786" (UID: "2401867d-7869-4633-aeeb-bfb3653c2786"). InnerVolumeSpecName "kube-api-access-lrvtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.925521 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-client-ca\") pod \"f0b32bd5-0420-437c-abe3-b568b5fced25\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.925617 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-proxy-ca-bundles\") pod \"f0b32bd5-0420-437c-abe3-b568b5fced25\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.925690 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2llk\" (UniqueName: \"kubernetes.io/projected/f0b32bd5-0420-437c-abe3-b568b5fced25-kube-api-access-n2llk\") pod \"f0b32bd5-0420-437c-abe3-b568b5fced25\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.925711 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-config\") pod \"f0b32bd5-0420-437c-abe3-b568b5fced25\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.925751 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b32bd5-0420-437c-abe3-b568b5fced25-serving-cert\") pod \"f0b32bd5-0420-437c-abe3-b568b5fced25\" (UID: \"f0b32bd5-0420-437c-abe3-b568b5fced25\") "
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.926079 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2401867d-7869-4633-aeeb-bfb3653c2786-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.926094 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrvtn\" (UniqueName: \"kubernetes.io/projected/2401867d-7869-4633-aeeb-bfb3653c2786-kube-api-access-lrvtn\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.926104 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2401867d-7869-4633-aeeb-bfb3653c2786-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.926774 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f0b32bd5-0420-437c-abe3-b568b5fced25" (UID: "f0b32bd5-0420-437c-abe3-b568b5fced25"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.926957 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-config" (OuterVolumeSpecName: "config") pod "f0b32bd5-0420-437c-abe3-b568b5fced25" (UID: "f0b32bd5-0420-437c-abe3-b568b5fced25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.927354 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-client-ca" (OuterVolumeSpecName: "client-ca") pod "f0b32bd5-0420-437c-abe3-b568b5fced25" (UID: "f0b32bd5-0420-437c-abe3-b568b5fced25"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.929236 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b32bd5-0420-437c-abe3-b568b5fced25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f0b32bd5-0420-437c-abe3-b568b5fced25" (UID: "f0b32bd5-0420-437c-abe3-b568b5fced25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:13:16 crc kubenswrapper[4870]: I0130 08:13:16.929446 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b32bd5-0420-437c-abe3-b568b5fced25-kube-api-access-n2llk" (OuterVolumeSpecName: "kube-api-access-n2llk") pod "f0b32bd5-0420-437c-abe3-b568b5fced25" (UID: "f0b32bd5-0420-437c-abe3-b568b5fced25"). InnerVolumeSpecName "kube-api-access-n2llk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.027299 4870 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.027361 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.027380 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2llk\" (UniqueName: \"kubernetes.io/projected/f0b32bd5-0420-437c-abe3-b568b5fced25-kube-api-access-n2llk\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.027399 4870 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0b32bd5-0420-437c-abe3-b568b5fced25-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.027414 4870 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0b32bd5-0420-437c-abe3-b568b5fced25-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.206919 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"]
Jan 30 08:13:17 crc kubenswrapper[4870]: E0130 08:13:17.207176 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b32bd5-0420-437c-abe3-b568b5fced25" containerName="controller-manager"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.207191 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b32bd5-0420-437c-abe3-b568b5fced25" containerName="controller-manager"
Jan 30 08:13:17 crc kubenswrapper[4870]: E0130 08:13:17.207211 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2401867d-7869-4633-aeeb-bfb3653c2786" containerName="route-controller-manager"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.207220 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2401867d-7869-4633-aeeb-bfb3653c2786" containerName="route-controller-manager"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.207323 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b32bd5-0420-437c-abe3-b568b5fced25" containerName="controller-manager"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.207337 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2401867d-7869-4633-aeeb-bfb3653c2786" containerName="route-controller-manager"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.207785 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.222739 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"]
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.331453 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-proxy-ca-bundles\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.331510 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-serving-cert\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.331557 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98cnb\" (UniqueName: \"kubernetes.io/projected/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-kube-api-access-98cnb\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.331739 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-config\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.332016 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-client-ca\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.342669 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj" event={"ID":"f0b32bd5-0420-437c-abe3-b568b5fced25","Type":"ContainerDied","Data":"9ddf8fa07886379822b5d7e837b006d9d1c35dacedfc83c71d1a7c67e61b89cf"}
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.342762 4870 scope.go:117] "RemoveContainer" containerID="e0c909e10ad146c5f25ddc4c306ee746d3ae6daa59a6ed6f36b3bed06976cc14"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.342690 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.346535 4870 generic.go:334] "Generic (PLEG): container finished" podID="2401867d-7869-4633-aeeb-bfb3653c2786" containerID="761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9" exitCode=0
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.346612 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" event={"ID":"2401867d-7869-4633-aeeb-bfb3653c2786","Type":"ContainerDied","Data":"761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9"}
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.346710 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt" event={"ID":"2401867d-7869-4633-aeeb-bfb3653c2786","Type":"ContainerDied","Data":"a4cce0bf268510416f33b5ed9e80efbf966fec2a36ba47966d7bf118bdd855e6"}
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.346648 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.362273 4870 scope.go:117] "RemoveContainer" containerID="761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.381530 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"]
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.386714 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c6cdccc5f-szscj"]
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.390665 4870 scope.go:117] "RemoveContainer" containerID="761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9"
Jan 30 08:13:17 crc kubenswrapper[4870]: E0130 08:13:17.395240 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9\": container with ID starting with 761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9 not found: ID does not exist" containerID="761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.395349 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9"} err="failed to get container status \"761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9\": rpc error: code = NotFound desc = could not find container \"761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9\": container with ID starting with 761594cbe8a68e508978ea78d2572aa1dd60ea6b5eadf680df6e2838d7862ec9 not found: ID does not exist"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.398607 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"]
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.403325 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-658c669f4d-v69zt"]
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.433043 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-proxy-ca-bundles\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.433104 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-serving-cert\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.433150 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98cnb\" (UniqueName: \"kubernetes.io/projected/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-kube-api-access-98cnb\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.433188 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-config\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.433237 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-client-ca\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.434508 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-proxy-ca-bundles\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.434516 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-client-ca\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.436584 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-config\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.443351 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-serving-cert\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.451434 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98cnb\" (UniqueName: \"kubernetes.io/projected/6c410aef-bbc0-4b86-9693-8fea3d6a2b52-kube-api-access-98cnb\") pod \"controller-manager-d47b9bcf6-rxwq5\" (UID: \"6c410aef-bbc0-4b86-9693-8fea3d6a2b52\") " pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.522384 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"
Jan 30 08:13:17 crc kubenswrapper[4870]: I0130 08:13:17.727936 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5"]
Jan 30 08:13:17 crc kubenswrapper[4870]: W0130 08:13:17.736479 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c410aef_bbc0_4b86_9693_8fea3d6a2b52.slice/crio-f597b452bb789ad048934aa20d33873fc1ab56b135ee49a8010312e54a2b7a86 WatchSource:0}: Error finding container f597b452bb789ad048934aa20d33873fc1ab56b135ee49a8010312e54a2b7a86: Status 404 returned error can't find the container with id f597b452bb789ad048934aa20d33873fc1ab56b135ee49a8010312e54a2b7a86
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.088180 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2401867d-7869-4633-aeeb-bfb3653c2786" path="/var/lib/kubelet/pods/2401867d-7869-4633-aeeb-bfb3653c2786/volumes"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.089544 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b32bd5-0420-437c-abe3-b568b5fced25" path="/var/lib/kubelet/pods/f0b32bd5-0420-437c-abe3-b568b5fced25/volumes"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.206331 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"]
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.207243 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.209751 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.210224 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.210585 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.211077 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.212905 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.213633 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.235949 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"]
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.348295 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-config\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.348385 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-client-ca\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.348463 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-serving-cert\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.348494 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d26w\" (UniqueName: \"kubernetes.io/projected/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-kube-api-access-2d26w\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.354308 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5" event={"ID":"6c410aef-bbc0-4b86-9693-8fea3d6a2b52","Type":"ContainerStarted","Data":"976ddcd8c849be741323e926680bf57ea7aee95de99ac166b0471fdaa18680e5"}
Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.354568 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="" pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.354668 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5" event={"ID":"6c410aef-bbc0-4b86-9693-8fea3d6a2b52","Type":"ContainerStarted","Data":"f597b452bb789ad048934aa20d33873fc1ab56b135ee49a8010312e54a2b7a86"} Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.360341 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.371684 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d47b9bcf6-rxwq5" podStartSLOduration=2.37166018 podStartE2EDuration="2.37166018s" podCreationTimestamp="2026-01-30 08:13:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:13:18.371461544 +0000 UTC m=+237.067008653" watchObservedRunningTime="2026-01-30 08:13:18.37166018 +0000 UTC m=+237.067207299" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.450386 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-serving-cert\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.450476 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d26w\" (UniqueName: \"kubernetes.io/projected/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-kube-api-access-2d26w\") pod \"route-controller-manager-995758d5-t7n56\" (UID: 
\"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.450533 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-config\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.450585 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-client-ca\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.451754 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-client-ca\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.451969 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-config\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.459774 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-serving-cert\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.468149 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d26w\" (UniqueName: \"kubernetes.io/projected/ccfb579a-39e5-4f92-bb80-ac591fe08c9d-kube-api-access-2d26w\") pod \"route-controller-manager-995758d5-t7n56\" (UID: \"ccfb579a-39e5-4f92-bb80-ac591fe08c9d\") " pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.523392 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:18 crc kubenswrapper[4870]: I0130 08:13:18.966440 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-995758d5-t7n56"] Jan 30 08:13:18 crc kubenswrapper[4870]: W0130 08:13:18.979432 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccfb579a_39e5_4f92_bb80_ac591fe08c9d.slice/crio-0e40cbbbe778ba97f80414f9b2ea05752afbc5dbc3782b4562ed7d64c2c7ac1c WatchSource:0}: Error finding container 0e40cbbbe778ba97f80414f9b2ea05752afbc5dbc3782b4562ed7d64c2c7ac1c: Status 404 returned error can't find the container with id 0e40cbbbe778ba97f80414f9b2ea05752afbc5dbc3782b4562ed7d64c2c7ac1c Jan 30 08:13:19 crc kubenswrapper[4870]: I0130 08:13:19.367469 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" 
event={"ID":"ccfb579a-39e5-4f92-bb80-ac591fe08c9d","Type":"ContainerStarted","Data":"09b638e3f1ceda7ea050b187880eb53408840e59511bb58cb7bdfb7a4aeced91"} Jan 30 08:13:19 crc kubenswrapper[4870]: I0130 08:13:19.368236 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" event={"ID":"ccfb579a-39e5-4f92-bb80-ac591fe08c9d","Type":"ContainerStarted","Data":"0e40cbbbe778ba97f80414f9b2ea05752afbc5dbc3782b4562ed7d64c2c7ac1c"} Jan 30 08:13:20 crc kubenswrapper[4870]: I0130 08:13:20.373647 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:20 crc kubenswrapper[4870]: I0130 08:13:20.378650 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" Jan 30 08:13:20 crc kubenswrapper[4870]: I0130 08:13:20.404247 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-995758d5-t7n56" podStartSLOduration=4.404225728 podStartE2EDuration="4.404225728s" podCreationTimestamp="2026-01-30 08:13:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:13:19.403023426 +0000 UTC m=+238.098570575" watchObservedRunningTime="2026-01-30 08:13:20.404225728 +0000 UTC m=+239.099772837" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.421018 4870 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.422326 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230" gracePeriod=15 Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.422507 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88" gracePeriod=15 Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.422597 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f" gracePeriod=15 Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.422492 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c" gracePeriod=15 Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.422693 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3" gracePeriod=15 Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.423675 4870 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.424019 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424044 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.424059 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424065 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.424074 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424080 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.424095 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424100 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.424114 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424120 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.424126 4870 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424132 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.424138 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424144 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424260 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424269 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424284 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424293 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424301 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.424635 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 
08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.425996 4870 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.426534 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.431515 4870 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.499714 4870 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516262 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516358 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516396 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516449 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516474 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516495 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516523 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.516577 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618267 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618345 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618376 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618413 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618431 4870 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618448 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618474 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618515 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618603 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618650 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618673 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618696 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618715 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618736 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618928 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.618977 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: I0130 08:13:28.800959 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:28 crc kubenswrapper[4870]: E0130 08:13:28.827093 4870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.227:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f74143b77aeea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 08:13:28.825761514 +0000 UTC m=+247.521308623,LastTimestamp:2026-01-30 08:13:28.825761514 +0000 UTC m=+247.521308623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.433953 4870 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1"} Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.434389 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2ffbf6ecdfdc578ae4eb5e6146f7bb20cf934023f01259cb24c87c1a90430b78"} Jan 30 08:13:29 crc kubenswrapper[4870]: E0130 08:13:29.435717 4870 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.436892 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.438580 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.439339 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c" exitCode=0 Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.439380 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88" exitCode=0 Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.439391 4870 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3" exitCode=0 Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.439400 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f" exitCode=2 Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.439436 4870 scope.go:117] "RemoveContainer" containerID="6fcd8a03faee764828ab3350a415aa8068f0c3d2dbb53c85fca720b11996a596" Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.442162 4870 generic.go:334] "Generic (PLEG): container finished" podID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" containerID="870ae255fc8aa69089480c5b4f44f2d48029e57db6c300a41e2ada010df31423" exitCode=0 Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.442213 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc","Type":"ContainerDied","Data":"870ae255fc8aa69089480c5b4f44f2d48029e57db6c300a41e2ada010df31423"} Jan 30 08:13:29 crc kubenswrapper[4870]: I0130 08:13:29.443104 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.452297 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.863429 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 
08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.864944 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.866104 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.866326 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.963893 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.964082 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.964119 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 08:13:30 crc kubenswrapper[4870]: 
I0130 08:13:30.964536 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.964583 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:13:30 crc kubenswrapper[4870]: I0130 08:13:30.964602 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.010215 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.011083 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.011863 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.065920 4870 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.065969 4870 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.065980 4870 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.167008 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kube-api-access\") pod \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " Jan 30 08:13:31 crc 
kubenswrapper[4870]: I0130 08:13:31.167107 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-var-lock\") pod \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.167127 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kubelet-dir\") pod \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\" (UID: \"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc\") " Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.167278 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-var-lock" (OuterVolumeSpecName: "var-lock") pod "08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" (UID: "08cc9cc4-dd06-46a2-94c4-0b1977cd1adc"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.167334 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" (UID: "08cc9cc4-dd06-46a2-94c4-0b1977cd1adc"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.167858 4870 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.167897 4870 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.173775 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" (UID: "08cc9cc4-dd06-46a2-94c4-0b1977cd1adc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.269110 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08cc9cc4-dd06-46a2-94c4-0b1977cd1adc-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.462828 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"08cc9cc4-dd06-46a2-94c4-0b1977cd1adc","Type":"ContainerDied","Data":"7bf7aacf1a3cb5782a5f9385d5b6312bd1fa309375e7e58df111c48bf3bdf731"} Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.462867 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.462917 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bf7aacf1a3cb5782a5f9385d5b6312bd1fa309375e7e58df111c48bf3bdf731" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.467030 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.469723 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230" exitCode=0 Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.469795 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.469808 4870 scope.go:117] "RemoveContainer" containerID="ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.480121 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.480945 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.492647 4870 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.493004 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.502224 4870 scope.go:117] "RemoveContainer" containerID="d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.520976 4870 scope.go:117] "RemoveContainer" containerID="8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.540747 4870 scope.go:117] "RemoveContainer" containerID="8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.570545 4870 scope.go:117] "RemoveContainer" containerID="217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230" Jan 30 08:13:31 crc kubenswrapper[4870]: E0130 08:13:31.580010 4870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.227:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f74143b77aeea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 08:13:28.825761514 +0000 UTC m=+247.521308623,LastTimestamp:2026-01-30 08:13:28.825761514 +0000 UTC m=+247.521308623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.591615 4870 scope.go:117] "RemoveContainer" containerID="f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.621066 4870 scope.go:117] "RemoveContainer" containerID="ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c" Jan 30 08:13:31 crc kubenswrapper[4870]: E0130 08:13:31.621689 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\": container with ID starting with ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c not found: ID does not exist" containerID="ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.621743 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c"} err="failed to get container status \"ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\": rpc error: code = NotFound desc = could not find container 
\"ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c\": container with ID starting with ccc2d63f97db5321f9c72585cd6e562a0ca4f14963330cacfb3782e7d2dec35c not found: ID does not exist" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.621776 4870 scope.go:117] "RemoveContainer" containerID="d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88" Jan 30 08:13:31 crc kubenswrapper[4870]: E0130 08:13:31.622138 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\": container with ID starting with d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88 not found: ID does not exist" containerID="d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.622176 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88"} err="failed to get container status \"d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\": rpc error: code = NotFound desc = could not find container \"d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88\": container with ID starting with d4f5020dd829f7eab4865918ab90a3e17125e207ad5a16712499dd3131536a88 not found: ID does not exist" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.622205 4870 scope.go:117] "RemoveContainer" containerID="8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3" Jan 30 08:13:31 crc kubenswrapper[4870]: E0130 08:13:31.624570 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\": container with ID starting with 8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3 not found: ID does not exist" 
containerID="8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.624668 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3"} err="failed to get container status \"8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\": rpc error: code = NotFound desc = could not find container \"8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3\": container with ID starting with 8da738a7146edaf4ed68342ddd159850cc5a05d6ed32a6d33f88489b476a1dc3 not found: ID does not exist" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.624758 4870 scope.go:117] "RemoveContainer" containerID="8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f" Jan 30 08:13:31 crc kubenswrapper[4870]: E0130 08:13:31.625122 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\": container with ID starting with 8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f not found: ID does not exist" containerID="8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.625154 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f"} err="failed to get container status \"8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\": rpc error: code = NotFound desc = could not find container \"8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f\": container with ID starting with 8b62f7006d760ccb828d749ade16e3e954ad73d4443910e839dc0da99fc7da7f not found: ID does not exist" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.625171 4870 scope.go:117] 
"RemoveContainer" containerID="217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230" Jan 30 08:13:31 crc kubenswrapper[4870]: E0130 08:13:31.625435 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\": container with ID starting with 217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230 not found: ID does not exist" containerID="217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.625470 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230"} err="failed to get container status \"217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\": rpc error: code = NotFound desc = could not find container \"217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230\": container with ID starting with 217fcd0b4245574b2e5a6ca34fd1e2834501700b4a204fde692811209c58d230 not found: ID does not exist" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.625490 4870 scope.go:117] "RemoveContainer" containerID="f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e" Jan 30 08:13:31 crc kubenswrapper[4870]: E0130 08:13:31.625746 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\": container with ID starting with f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e not found: ID does not exist" containerID="f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e" Jan 30 08:13:31 crc kubenswrapper[4870]: I0130 08:13:31.625772 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e"} err="failed to get container status \"f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\": rpc error: code = NotFound desc = could not find container \"f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e\": container with ID starting with f53bf7dfca462f12650874f2cf41b88c4f423acdda333808c8375b1150d1b25e not found: ID does not exist" Jan 30 08:13:32 crc kubenswrapper[4870]: I0130 08:13:32.076917 4870 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:32 crc kubenswrapper[4870]: I0130 08:13:32.077202 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:32 crc kubenswrapper[4870]: I0130 08:13:32.093032 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.054927 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.055494 4870 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.056056 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.056505 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.056985 4870 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:33 crc kubenswrapper[4870]: I0130 08:13:33.057039 4870 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.057510 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="200ms" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.258480 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="400ms" Jan 30 08:13:33 crc kubenswrapper[4870]: E0130 08:13:33.660078 4870 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="800ms" Jan 30 08:13:34 crc kubenswrapper[4870]: E0130 08:13:34.462571 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="1.6s" Jan 30 08:13:36 crc kubenswrapper[4870]: E0130 08:13:36.064826 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="3.2s" Jan 30 08:13:39 crc kubenswrapper[4870]: E0130 08:13:39.266806 4870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused" interval="6.4s" Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.064812 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.065477 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: 
connect: connection refused" Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.554577 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.554655 4870 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee" exitCode=1 Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.554703 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee"} Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.555412 4870 scope.go:117] "RemoveContainer" containerID="39954930519fa4f0bd5c32bbcc70efd3789d0f7c88db456f84cb5aac9914e7ee" Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.556120 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:41 crc kubenswrapper[4870]: I0130 08:13:41.556729 4870 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" Jan 30 08:13:41 crc kubenswrapper[4870]: E0130 08:13:41.581434 4870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.227:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f74143b77aeea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 08:13:28.825761514 +0000 UTC m=+247.521308623,LastTimestamp:2026-01-30 08:13:28.825761514 +0000 UTC m=+247.521308623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.074764 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.081354 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.082341 4870 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.083414 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.084124 4870 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.109956 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.110019 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.110920 4870 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.111842 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:42 crc kubenswrapper[4870]: W0130 08:13:42.136555 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-5d6ee682a7137fd8fd43af6de87510fbef44ab62bc710438839abc6b85902c66 WatchSource:0}: Error finding container 5d6ee682a7137fd8fd43af6de87510fbef44ab62bc710438839abc6b85902c66: Status 404 returned error can't find the container with id 5d6ee682a7137fd8fd43af6de87510fbef44ab62bc710438839abc6b85902c66
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.562273 4870 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="801bf827ba75a2bd4f1f66ac98628a28283cc9ae17abf74ddb2b42aa68294fc2" exitCode=0
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.562383 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"801bf827ba75a2bd4f1f66ac98628a28283cc9ae17abf74ddb2b42aa68294fc2"}
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.562436 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5d6ee682a7137fd8fd43af6de87510fbef44ab62bc710438839abc6b85902c66"}
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.562740 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.562760 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.563127 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.563270 4870 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.563395 4870 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.568321 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:13:42Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:13:42Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:13:42Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T08:13:42Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.568615 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.569025 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.569240 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.569383 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.569428 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3a4b05f9c56e0bcb68b90d3bc04c870a8ad34240b7b8e01fe0ea0c0ff8d96966"}
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.569505 4870 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: E0130 08:13:42.569528 4870 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.570058 4870 status_manager.go:851] "Failed to get status for pod" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:42 crc kubenswrapper[4870]: I0130 08:13:42.570341 4870 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.227:6443: connect: connection refused"
Jan 30 08:13:43 crc kubenswrapper[4870]: I0130 08:13:43.590641 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"18505e124e70139bc29db2f9c0c908d32a57c36d399f2bf87b4d89e5eb54791d"}
Jan 30 08:13:43 crc kubenswrapper[4870]: I0130 08:13:43.591146 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"84bfff090f314b9cf4785bf9ebd2f087ce25e90005c9bf86a319e189bfd50d2f"}
Jan 30 08:13:43 crc kubenswrapper[4870]: I0130 08:13:43.591160 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2dc5dfd23b2c524d1eb3a37239b6b72ef241056263a87dbdbd456e4496a40e33"}
Jan 30 08:13:43 crc kubenswrapper[4870]: I0130 08:13:43.591171 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"150630c5483ac78de27ff71355e3e33875c6d5b13a7128b1084bfd4e2280ded6"}
Jan 30 08:13:44 crc kubenswrapper[4870]: I0130 08:13:44.601026 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"09329e389e6466ca3f26daf4fe54c77798e955e67860273c6afe3e689d2cded1"}
Jan 30 08:13:44 crc kubenswrapper[4870]: I0130 08:13:44.601270 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:44 crc kubenswrapper[4870]: I0130 08:13:44.601494 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:44 crc kubenswrapper[4870]: I0130 08:13:44.601543 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:47 crc kubenswrapper[4870]: I0130 08:13:47.112845 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:47 crc kubenswrapper[4870]: I0130 08:13:47.113914 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:47 crc kubenswrapper[4870]: I0130 08:13:47.120962 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:48 crc kubenswrapper[4870]: I0130 08:13:48.037721 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 08:13:48 crc kubenswrapper[4870]: I0130 08:13:48.561467 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 08:13:48 crc kubenswrapper[4870]: I0130 08:13:48.561973 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Jan 30 08:13:48 crc kubenswrapper[4870]: I0130 08:13:48.562019 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Jan 30 08:13:49 crc kubenswrapper[4870]: I0130 08:13:49.613679 4870 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:49 crc kubenswrapper[4870]: I0130 08:13:49.640534 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:49 crc kubenswrapper[4870]: I0130 08:13:49.640572 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:49 crc kubenswrapper[4870]: I0130 08:13:49.644573 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:13:50 crc kubenswrapper[4870]: I0130 08:13:50.647358 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:50 crc kubenswrapper[4870]: I0130 08:13:50.647737 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:13:52 crc kubenswrapper[4870]: I0130 08:13:52.107702 4870 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2893a51b-5127-4be5-aa18-e3db0e84dad1"
Jan 30 08:13:58 crc kubenswrapper[4870]: I0130 08:13:58.559767 4870 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Jan 30 08:13:58 crc kubenswrapper[4870]: I0130 08:13:58.560861 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Jan 30 08:13:59 crc kubenswrapper[4870]: I0130 08:13:59.261736 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 30 08:13:59 crc kubenswrapper[4870]: I0130 08:13:59.499626 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 30 08:13:59 crc kubenswrapper[4870]: I0130 08:13:59.657937 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 30 08:13:59 crc kubenswrapper[4870]: I0130 08:13:59.903243 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 30 08:14:00 crc kubenswrapper[4870]: I0130 08:14:00.390132 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 30 08:14:00 crc kubenswrapper[4870]: I0130 08:14:00.540224 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 30 08:14:00 crc kubenswrapper[4870]: I0130 08:14:00.643933 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 30 08:14:00 crc kubenswrapper[4870]: I0130 08:14:00.969636 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 30 08:14:00 crc kubenswrapper[4870]: I0130 08:14:00.997514 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 30 08:14:01 crc kubenswrapper[4870]: I0130 08:14:01.065330 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 30 08:14:01 crc kubenswrapper[4870]: I0130 08:14:01.156862 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 30 08:14:01 crc kubenswrapper[4870]: I0130 08:14:01.182400 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 30 08:14:01 crc kubenswrapper[4870]: I0130 08:14:01.626839 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 30 08:14:01 crc kubenswrapper[4870]: I0130 08:14:01.885112 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 30 08:14:01 crc kubenswrapper[4870]: I0130 08:14:01.960991 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.030139 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.038255 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.152666 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.248654 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.435935 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.462684 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.561188 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.562031 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.665294 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.805155 4870 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 30 08:14:02 crc kubenswrapper[4870]: I0130 08:14:02.849114 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.039204 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.044040 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.178753 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.189189 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.229598 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.333834 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.394645 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.450208 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.501667 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.526834 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.582804 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.596748 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.703004 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.716128 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.763801 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.791164 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 30 08:14:03 crc kubenswrapper[4870]: I0130 08:14:03.915916 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.028273 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.165534 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.212704 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.251188 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.306802 4870 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.314683 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.315025 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.315191 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jh9j6","openshift-marketplace/redhat-operators-85lwg","openshift-marketplace/redhat-marketplace-jqng8","openshift-marketplace/community-operators-cx2x5","openshift-marketplace/certified-operators-rk4lj"]
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.315603 4870 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.315648 4870 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aad2d5d4-4cd8-4b7e-ba10-017838ecf3ff"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.315624 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cx2x5" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="registry-server" containerID="cri-o://5d7f1da1a59f0ee841deb45c6681be28192aa5d1b0765a1de1cc229f89986ccd" gracePeriod=30
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.316063 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jqng8" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="registry-server" containerID="cri-o://5086a69b5f8df7175222c9a53597ceeaa092692fca7dd2ea0dc59c15c50cec17" gracePeriod=30
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.316109 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerName="marketplace-operator" containerID="cri-o://97e57ef1c7ce66b357f1ed6e1f8847cdeaccd06e556466aa82594a1548b78355" gracePeriod=30
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.316204 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-85lwg" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="registry-server" containerID="cri-o://bbc787722aa3ad5d86b9358f4c93fb7295e3213baf6f9aa990b2876df2f315f2" gracePeriod=30
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.316396 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rk4lj" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="registry-server" containerID="cri-o://9c925d71b4dfdc55925a74993dfa3447a8c069656a12a2daf0cbfcade11ab1ed" gracePeriod=30
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.336084 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.353063 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.353016566 podStartE2EDuration="15.353016566s" podCreationTimestamp="2026-01-30 08:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:14:04.350082414 +0000 UTC m=+283.045629613" watchObservedRunningTime="2026-01-30 08:14:04.353016566 +0000 UTC m=+283.048563675"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.360079 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.550710 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.571455 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.637906 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.733289 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.743951 4870 generic.go:334] "Generic (PLEG): container finished" podID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerID="5086a69b5f8df7175222c9a53597ceeaa092692fca7dd2ea0dc59c15c50cec17" exitCode=0
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.744033 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqng8" event={"ID":"56cb5ce8-da4f-4c24-9805-18a91b316bcd","Type":"ContainerDied","Data":"5086a69b5f8df7175222c9a53597ceeaa092692fca7dd2ea0dc59c15c50cec17"}
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.746387 4870 generic.go:334] "Generic (PLEG): container finished" podID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerID="bbc787722aa3ad5d86b9358f4c93fb7295e3213baf6f9aa990b2876df2f315f2" exitCode=0
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.746445 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-85lwg" event={"ID":"1d50529a-bc06-49a9-a5bf-64e91e8734c2","Type":"ContainerDied","Data":"bbc787722aa3ad5d86b9358f4c93fb7295e3213baf6f9aa990b2876df2f315f2"}
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.748329 4870 generic.go:334] "Generic (PLEG): container finished" podID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerID="97e57ef1c7ce66b357f1ed6e1f8847cdeaccd06e556466aa82594a1548b78355" exitCode=0
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.748374 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" event={"ID":"8ede517d-773d-4f0b-8c0a-42ae13359f95","Type":"ContainerDied","Data":"97e57ef1c7ce66b357f1ed6e1f8847cdeaccd06e556466aa82594a1548b78355"}
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.750432 4870 generic.go:334] "Generic (PLEG): container finished" podID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerID="9c925d71b4dfdc55925a74993dfa3447a8c069656a12a2daf0cbfcade11ab1ed" exitCode=0
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.750489 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4lj" event={"ID":"ba2950a4-e1b9-45a9-9980-1b4169e0fb16","Type":"ContainerDied","Data":"9c925d71b4dfdc55925a74993dfa3447a8c069656a12a2daf0cbfcade11ab1ed"}
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.752172 4870 generic.go:334] "Generic (PLEG): container finished" podID="258d3e35-5580-4108-889c-9d5d2f80c810" containerID="5d7f1da1a59f0ee841deb45c6681be28192aa5d1b0765a1de1cc229f89986ccd" exitCode=0
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.753061 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2x5" event={"ID":"258d3e35-5580-4108-889c-9d5d2f80c810","Type":"ContainerDied","Data":"5d7f1da1a59f0ee841deb45c6681be28192aa5d1b0765a1de1cc229f89986ccd"}
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.828727 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.854241 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.865226 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.874471 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqng8"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.955404 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.968779 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.974719 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-85lwg"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.981328 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk4lj"
Jan 30 08:14:04 crc kubenswrapper[4870]: I0130 08:14:04.993524 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx2x5"
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.011787 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-utilities\") pod \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.011909 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz6vk\" (UniqueName: \"kubernetes.io/projected/56cb5ce8-da4f-4c24-9805-18a91b316bcd-kube-api-access-nz6vk\") pod \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.011985 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-catalog-content\") pod \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\" (UID: \"56cb5ce8-da4f-4c24-9805-18a91b316bcd\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.013856 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-utilities" (OuterVolumeSpecName: "utilities") pod "56cb5ce8-da4f-4c24-9805-18a91b316bcd" (UID: "56cb5ce8-da4f-4c24-9805-18a91b316bcd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.021045 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56cb5ce8-da4f-4c24-9805-18a91b316bcd-kube-api-access-nz6vk" (OuterVolumeSpecName: "kube-api-access-nz6vk") pod "56cb5ce8-da4f-4c24-9805-18a91b316bcd" (UID: "56cb5ce8-da4f-4c24-9805-18a91b316bcd"). InnerVolumeSpecName "kube-api-access-nz6vk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.037098 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56cb5ce8-da4f-4c24-9805-18a91b316bcd" (UID: "56cb5ce8-da4f-4c24-9805-18a91b316bcd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.073743 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.107700 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.113914 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44g22\" (UniqueName: \"kubernetes.io/projected/258d3e35-5580-4108-889c-9d5d2f80c810-kube-api-access-44g22\") pod \"258d3e35-5580-4108-889c-9d5d2f80c810\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114007 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-catalog-content\") pod \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114384 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-utilities\") pod \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114436 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmwtm\" (UniqueName: \"kubernetes.io/projected/8ede517d-773d-4f0b-8c0a-42ae13359f95-kube-api-access-qmwtm\") pod \"8ede517d-773d-4f0b-8c0a-42ae13359f95\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114537 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-catalog-content\") pod \"258d3e35-5580-4108-889c-9d5d2f80c810\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114570 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-operator-metrics\") pod \"8ede517d-773d-4f0b-8c0a-42ae13359f95\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114614 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg24l\" (UniqueName: \"kubernetes.io/projected/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-kube-api-access-rg24l\") pod \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\" (UID: \"ba2950a4-e1b9-45a9-9980-1b4169e0fb16\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114641 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drv8z\" (UniqueName: \"kubernetes.io/projected/1d50529a-bc06-49a9-a5bf-64e91e8734c2-kube-api-access-drv8z\") pod \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114690 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-trusted-ca\") pod \"8ede517d-773d-4f0b-8c0a-42ae13359f95\" (UID: \"8ede517d-773d-4f0b-8c0a-42ae13359f95\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114709 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-utilities\") pod \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.114738 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-utilities\") pod \"258d3e35-5580-4108-889c-9d5d2f80c810\" (UID: \"258d3e35-5580-4108-889c-9d5d2f80c810\") "
Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.115927 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-utilities" (OuterVolumeSpecName: "utilities") pod "1d50529a-bc06-49a9-a5bf-64e91e8734c2" (UID: "1d50529a-bc06-49a9-a5bf-64e91e8734c2"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.116196 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8ede517d-773d-4f0b-8c0a-42ae13359f95" (UID: "8ede517d-773d-4f0b-8c0a-42ae13359f95"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.116438 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-utilities" (OuterVolumeSpecName: "utilities") pod "ba2950a4-e1b9-45a9-9980-1b4169e0fb16" (UID: "ba2950a4-e1b9-45a9-9980-1b4169e0fb16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.116457 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-catalog-content\") pod \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\" (UID: \"1d50529a-bc06-49a9-a5bf-64e91e8734c2\") " Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.116606 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-utilities" (OuterVolumeSpecName: "utilities") pod "258d3e35-5580-4108-889c-9d5d2f80c810" (UID: "258d3e35-5580-4108-889c-9d5d2f80c810"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.116703 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258d3e35-5580-4108-889c-9d5d2f80c810-kube-api-access-44g22" (OuterVolumeSpecName: "kube-api-access-44g22") pod "258d3e35-5580-4108-889c-9d5d2f80c810" (UID: "258d3e35-5580-4108-889c-9d5d2f80c810"). InnerVolumeSpecName "kube-api-access-44g22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117657 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117689 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117713 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117727 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117740 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44g22\" (UniqueName: \"kubernetes.io/projected/258d3e35-5580-4108-889c-9d5d2f80c810-kube-api-access-44g22\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117752 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117764 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz6vk\" (UniqueName: \"kubernetes.io/projected/56cb5ce8-da4f-4c24-9805-18a91b316bcd-kube-api-access-nz6vk\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.117776 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56cb5ce8-da4f-4c24-9805-18a91b316bcd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.121776 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8ede517d-773d-4f0b-8c0a-42ae13359f95" (UID: "8ede517d-773d-4f0b-8c0a-42ae13359f95"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.121853 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d50529a-bc06-49a9-a5bf-64e91e8734c2-kube-api-access-drv8z" (OuterVolumeSpecName: "kube-api-access-drv8z") pod "1d50529a-bc06-49a9-a5bf-64e91e8734c2" (UID: "1d50529a-bc06-49a9-a5bf-64e91e8734c2"). InnerVolumeSpecName "kube-api-access-drv8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.121910 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ede517d-773d-4f0b-8c0a-42ae13359f95-kube-api-access-qmwtm" (OuterVolumeSpecName: "kube-api-access-qmwtm") pod "8ede517d-773d-4f0b-8c0a-42ae13359f95" (UID: "8ede517d-773d-4f0b-8c0a-42ae13359f95"). 
InnerVolumeSpecName "kube-api-access-qmwtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.122402 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-kube-api-access-rg24l" (OuterVolumeSpecName: "kube-api-access-rg24l") pod "ba2950a4-e1b9-45a9-9980-1b4169e0fb16" (UID: "ba2950a4-e1b9-45a9-9980-1b4169e0fb16"). InnerVolumeSpecName "kube-api-access-rg24l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.161744 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.182941 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "258d3e35-5580-4108-889c-9d5d2f80c810" (UID: "258d3e35-5580-4108-889c-9d5d2f80c810"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.187205 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.192523 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba2950a4-e1b9-45a9-9980-1b4169e0fb16" (UID: "ba2950a4-e1b9-45a9-9980-1b4169e0fb16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.219082 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/258d3e35-5580-4108-889c-9d5d2f80c810-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.219117 4870 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ede517d-773d-4f0b-8c0a-42ae13359f95-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.219133 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg24l\" (UniqueName: \"kubernetes.io/projected/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-kube-api-access-rg24l\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.219146 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drv8z\" (UniqueName: \"kubernetes.io/projected/1d50529a-bc06-49a9-a5bf-64e91e8734c2-kube-api-access-drv8z\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.219162 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2950a4-e1b9-45a9-9980-1b4169e0fb16-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.219175 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmwtm\" (UniqueName: \"kubernetes.io/projected/8ede517d-773d-4f0b-8c0a-42ae13359f95-kube-api-access-qmwtm\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.240689 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.261242 4870 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d50529a-bc06-49a9-a5bf-64e91e8734c2" (UID: "1d50529a-bc06-49a9-a5bf-64e91e8734c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.321103 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d50529a-bc06-49a9-a5bf-64e91e8734c2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.325905 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.390138 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.400464 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.499373 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.556060 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.559782 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.584696 4870 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.644384 4870 reflector.go:368] Caches 
populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.719224 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.733633 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.761557 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk4lj" event={"ID":"ba2950a4-e1b9-45a9-9980-1b4169e0fb16","Type":"ContainerDied","Data":"b4deb94680d10a0e49b737adc1e5d0d479b58878615ce9ba8009bd204fb58e39"} Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.761678 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk4lj" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.761734 4870 scope.go:117] "RemoveContainer" containerID="9c925d71b4dfdc55925a74993dfa3447a8c069656a12a2daf0cbfcade11ab1ed" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.764419 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx2x5" event={"ID":"258d3e35-5580-4108-889c-9d5d2f80c810","Type":"ContainerDied","Data":"ea85190d876bcbca144726c237a14b6d31ba3248e8f165a1e622d666e72b6022"} Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.764549 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cx2x5" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.768539 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqng8" event={"ID":"56cb5ce8-da4f-4c24-9805-18a91b316bcd","Type":"ContainerDied","Data":"5318a5759e8a4ecffb11be37d9689df0b960dc674f99fd5d3cb764e4f3066de3"} Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.768557 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqng8" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.771855 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-85lwg" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.771869 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-85lwg" event={"ID":"1d50529a-bc06-49a9-a5bf-64e91e8734c2","Type":"ContainerDied","Data":"1b14874ab64bd9943b3954bf834f4ae30ab6a234601d5bd7fe08c6631f1c0819"} Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.774238 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" event={"ID":"8ede517d-773d-4f0b-8c0a-42ae13359f95","Type":"ContainerDied","Data":"3b948b615fba724f1687e73e5fcca06ca297443c072f9ebaf1a3471eb522792b"} Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.774938 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jh9j6" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.776778 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.788919 4870 scope.go:117] "RemoveContainer" containerID="4e94e4129ecab37de0297dde4dc86e9ac30e8fda6a11f59af65a8c199b125d87" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.825018 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rk4lj"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.835957 4870 scope.go:117] "RemoveContainer" containerID="2e15cf3e43d60efa400786600f10aabddcac1a402cf20155c96332c4d505ad73" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.837856 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rk4lj"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.842364 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqng8"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.845494 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqng8"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.849110 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jh9j6"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.852218 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jh9j6"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.857536 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cx2x5"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.860379 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-cx2x5"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.864180 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-85lwg"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.866865 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-85lwg"] Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.868365 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.879655 4870 scope.go:117] "RemoveContainer" containerID="5d7f1da1a59f0ee841deb45c6681be28192aa5d1b0765a1de1cc229f89986ccd" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.895025 4870 scope.go:117] "RemoveContainer" containerID="31bdc406d04a8518a48f85291f438714500a3199ef4565a4e1bcc218ea393cac" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.903469 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.914475 4870 scope.go:117] "RemoveContainer" containerID="b8e1ab4ce4d07cf81dd3964239182751d6d8a8cb595e0cabe44b1efd32e0f612" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.930182 4870 scope.go:117] "RemoveContainer" containerID="5086a69b5f8df7175222c9a53597ceeaa092692fca7dd2ea0dc59c15c50cec17" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.946907 4870 scope.go:117] "RemoveContainer" containerID="6734abf7e123160f7f9ec15e63bcacb2803b7e9b5f597cb9ce9439f6abad0e28" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.963782 4870 scope.go:117] "RemoveContainer" containerID="8a3a4ecde2801a20f3bb4ccdc68bab1d46b831e5569a15eb1e5876330bbb7d42" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.978977 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.982536 4870 scope.go:117] "RemoveContainer" containerID="bbc787722aa3ad5d86b9358f4c93fb7295e3213baf6f9aa990b2876df2f315f2" Jan 30 08:14:05 crc kubenswrapper[4870]: I0130 08:14:05.997480 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.002164 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.002463 4870 scope.go:117] "RemoveContainer" containerID="d0dc443c5c9b20693d4448270af7993e64d959e03cda7b880c0de95b2ee5007b" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.006861 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.010393 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.022665 4870 scope.go:117] "RemoveContainer" containerID="f4984448372f3c99bd2eb627d2f6a37eee0cab48c315336c3d5192e15f6bb85e" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.042634 4870 scope.go:117] "RemoveContainer" containerID="97e57ef1c7ce66b357f1ed6e1f8847cdeaccd06e556466aa82594a1548b78355" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.071107 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.085414 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" path="/var/lib/kubelet/pods/1d50529a-bc06-49a9-a5bf-64e91e8734c2/volumes" Jan 30 08:14:06 
crc kubenswrapper[4870]: I0130 08:14:06.086362 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" path="/var/lib/kubelet/pods/258d3e35-5580-4108-889c-9d5d2f80c810/volumes" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.087439 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" path="/var/lib/kubelet/pods/56cb5ce8-da4f-4c24-9805-18a91b316bcd/volumes" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.089063 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" path="/var/lib/kubelet/pods/8ede517d-773d-4f0b-8c0a-42ae13359f95/volumes" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.089722 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" path="/var/lib/kubelet/pods/ba2950a4-e1b9-45a9-9980-1b4169e0fb16/volumes" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.392300 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.405715 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.406529 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.422279 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.428067 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.581244 4870 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.588988 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.617912 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.686250 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.693623 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.781816 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.895503 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 08:14:06 crc kubenswrapper[4870]: I0130 08:14:06.987072 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.032009 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.203520 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.243966 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.248192 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.369258 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.375825 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.486542 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.527764 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.558727 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.566326 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.598417 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.626679 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.679205 4870 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.713720 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.757733 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.758162 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.814141 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.814294 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.908333 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 30 08:14:07 crc kubenswrapper[4870]: I0130 08:14:07.909528 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.019513 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.070413 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.086033 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.139200 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.157207 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.271687 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.293353 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.368325 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.420259 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.472279 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.548140 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.548855 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.556787 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.561795 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.567596 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.569744 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.633690 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.710603 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.742782 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.811040 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.851274 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.886840 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.896910 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.991182 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 30 08:14:08 crc kubenswrapper[4870]: I0130 08:14:08.993505 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.015527 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.100843 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.145453 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.154750 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.164253 4870 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.218480 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.222783 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.224397 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.245026 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.260578 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.304110 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.422927 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.445912 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.457032 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.481517 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.495059 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.513020 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.516442 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.573017 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.675066 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.742026 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.755668 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.774990 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.795166 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.804171 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.809183 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.835524 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.844676 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.845673 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.943226 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.952441 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 30 08:14:09 crc kubenswrapper[4870]: I0130 08:14:09.974653 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.107698 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.181019 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.187170 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.208296 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.225964 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.344245 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.455202 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.671107 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.677387 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.701500 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.741590 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.760709 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.893126 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.915120 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.919055 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 30 08:14:10 crc kubenswrapper[4870]: I0130 08:14:10.980938 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.019897 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.034734 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.086201 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.102831 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.274003 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.350763 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.406409 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.467994 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.523692 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.614173 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.707807 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.815743 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.844164 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.854246 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Jan 30 08:14:11 crc kubenswrapper[4870]: I0130 08:14:11.957773 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.022526 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.030047 4870 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.030373 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1" gracePeriod=5
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.245839 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.275096 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.363409 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.378826 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.470396 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.696902 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.727072 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.749812 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.754244 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.769999 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.834081 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.838705 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.947928 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 30 08:14:12 crc kubenswrapper[4870]: I0130 08:14:12.973988 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.127900 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.183471 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.390623 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.430365 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.457805 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.480616 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.583955 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.612825 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.623141 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.710619 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 30 08:14:13 crc kubenswrapper[4870]: I0130 08:14:13.988606 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.037597 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.055215 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.137723 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.207852 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.456835 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.471893 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.484833 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.653853 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.755792 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.902894 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 08:14:14 crc kubenswrapper[4870]: I0130 08:14:14.904578 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 30 08:14:15 crc kubenswrapper[4870]: I0130 08:14:15.199688 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 30 08:14:15 crc kubenswrapper[4870]: I0130 08:14:15.350797 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 30 08:14:15 crc kubenswrapper[4870]: I0130 08:14:15.527667 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 30 08:14:16 crc kubenswrapper[4870]: I0130 08:14:16.233913 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 30 08:14:16 crc kubenswrapper[4870]: I0130 08:14:16.385611 4870 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 30 08:14:16 crc kubenswrapper[4870]: I0130 08:14:16.512727 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 30 08:14:16 crc kubenswrapper[4870]: I0130 08:14:16.735202 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 08:14:16 crc kubenswrapper[4870]: I0130 08:14:16.777444 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.630806 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.631413 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.732723 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.732921 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.732926 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.732961 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733020 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733177 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733227 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733338 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733418 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733692 4870 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733718 4870 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733733 4870 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.733746 4870 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.742317 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.834902 4870 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.869357 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.869416 4870 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1" exitCode=137
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.869465 4870 scope.go:117] "RemoveContainer" containerID="c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1"
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.869630 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.888975 4870 scope.go:117] "RemoveContainer" containerID="c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1"
Jan 30 08:14:17 crc kubenswrapper[4870]: E0130 08:14:17.889637 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1\": container with ID starting with c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1 not found: ID does not exist" containerID="c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1"
Jan 30 08:14:17 crc kubenswrapper[4870]: I0130 08:14:17.889709 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1"} err="failed to get container status \"c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1\": rpc error: code = NotFound desc = could not find container \"c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1\": container with ID starting with c0c72089a10d44666fb67e168c5b95e04bd873513b7b27b2f406fcf4e49ed8d1 not found: ID does not exist"
Jan 30 08:14:18 crc kubenswrapper[4870]: I0130 08:14:18.082861 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 30 08:14:18 crc kubenswrapper[4870]: I0130 08:14:18.341732 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.033249 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkhzd"]
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034045 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" containerName="installer"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034062 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" containerName="installer"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034073 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="extract-utilities"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034082 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="extract-utilities"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034096 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerName="marketplace-operator"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034104 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerName="marketplace-operator"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034112 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="registry-server"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034118 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="registry-server"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034128 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="registry-server"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034134 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="registry-server"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034141 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034148 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034158 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="extract-content"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034164 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="extract-content"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034174 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="extract-utilities"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034182 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="extract-utilities"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034192 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="registry-server"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034198 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="registry-server"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034237 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="extract-content"
Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034244 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="extract-content"
Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034258
4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="extract-content" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034264 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="extract-content" Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034274 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="extract-utilities" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034281 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="extract-utilities" Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034287 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="registry-server" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034293 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="registry-server" Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034300 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="extract-content" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034308 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="extract-content" Jan 30 08:14:19 crc kubenswrapper[4870]: E0130 08:14:19.034316 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="extract-utilities" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034323 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="extract-utilities" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034413 4870 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1d50529a-bc06-49a9-a5bf-64e91e8734c2" containerName="registry-server" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034425 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="56cb5ce8-da4f-4c24-9805-18a91b316bcd" containerName="registry-server" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034433 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="258d3e35-5580-4108-889c-9d5d2f80c810" containerName="registry-server" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034442 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="08cc9cc4-dd06-46a2-94c4-0b1977cd1adc" containerName="installer" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034451 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2950a4-e1b9-45a9-9980-1b4169e0fb16" containerName="registry-server" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034460 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ede517d-773d-4f0b-8c0a-42ae13359f95" containerName="marketplace-operator" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034468 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.034985 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.040210 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.041806 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.042043 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.056386 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.070127 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.086909 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkhzd"] Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.170032 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83d46dd9-5ab7-44c9-b032-1241911b6d82-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.170100 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gthr6\" (UniqueName: \"kubernetes.io/projected/83d46dd9-5ab7-44c9-b032-1241911b6d82-kube-api-access-gthr6\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: 
\"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.170152 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83d46dd9-5ab7-44c9-b032-1241911b6d82-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.271795 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gthr6\" (UniqueName: \"kubernetes.io/projected/83d46dd9-5ab7-44c9-b032-1241911b6d82-kube-api-access-gthr6\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.271857 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83d46dd9-5ab7-44c9-b032-1241911b6d82-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.271935 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83d46dd9-5ab7-44c9-b032-1241911b6d82-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.273518 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83d46dd9-5ab7-44c9-b032-1241911b6d82-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.277298 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83d46dd9-5ab7-44c9-b032-1241911b6d82-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.305928 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gthr6\" (UniqueName: \"kubernetes.io/projected/83d46dd9-5ab7-44c9-b032-1241911b6d82-kube-api-access-gthr6\") pod \"marketplace-operator-79b997595-vkhzd\" (UID: \"83d46dd9-5ab7-44c9-b032-1241911b6d82\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.352940 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.840526 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkhzd"] Jan 30 08:14:19 crc kubenswrapper[4870]: W0130 08:14:19.847284 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83d46dd9_5ab7_44c9_b032_1241911b6d82.slice/crio-fb38729d476b3402e5fe7c9ab67d74ba742cb40594d8331879bd3cf0e116ee78 WatchSource:0}: Error finding container fb38729d476b3402e5fe7c9ab67d74ba742cb40594d8331879bd3cf0e116ee78: Status 404 returned error can't find the container with id fb38729d476b3402e5fe7c9ab67d74ba742cb40594d8331879bd3cf0e116ee78 Jan 30 08:14:19 crc kubenswrapper[4870]: I0130 08:14:19.884638 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" event={"ID":"83d46dd9-5ab7-44c9-b032-1241911b6d82","Type":"ContainerStarted","Data":"fb38729d476b3402e5fe7c9ab67d74ba742cb40594d8331879bd3cf0e116ee78"} Jan 30 08:14:20 crc kubenswrapper[4870]: I0130 08:14:20.892253 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" event={"ID":"83d46dd9-5ab7-44c9-b032-1241911b6d82","Type":"ContainerStarted","Data":"fe96c7d47a996df0fd9fed7c61d5a9257f0f30c59f02d890063b39690be17911"} Jan 30 08:14:20 crc kubenswrapper[4870]: I0130 08:14:20.892569 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:20 crc kubenswrapper[4870]: I0130 08:14:20.897743 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" Jan 30 08:14:20 crc kubenswrapper[4870]: I0130 08:14:20.913464 4870 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vkhzd" podStartSLOduration=1.913445574 podStartE2EDuration="1.913445574s" podCreationTimestamp="2026-01-30 08:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:14:20.912895557 +0000 UTC m=+299.608442656" watchObservedRunningTime="2026-01-30 08:14:20.913445574 +0000 UTC m=+299.608992683" Jan 30 08:14:21 crc kubenswrapper[4870]: I0130 08:14:21.856660 4870 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.190912 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc"] Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.192224 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.196429 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.197556 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.206182 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc"] Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.350318 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-secret-volume\") pod \"collect-profiles-29496015-h7vrc\" (UID: 
\"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.350391 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d2tx\" (UniqueName: \"kubernetes.io/projected/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-kube-api-access-7d2tx\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.350448 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-config-volume\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.451758 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-secret-volume\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.451810 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d2tx\" (UniqueName: \"kubernetes.io/projected/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-kube-api-access-7d2tx\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.451859 4870 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-config-volume\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.452799 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-config-volume\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.459359 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-secret-volume\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.469141 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d2tx\" (UniqueName: \"kubernetes.io/projected/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-kube-api-access-7d2tx\") pod \"collect-profiles-29496015-h7vrc\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:00 crc kubenswrapper[4870]: I0130 08:15:00.527093 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:01 crc kubenswrapper[4870]: I0130 08:15:01.013404 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc"] Jan 30 08:15:01 crc kubenswrapper[4870]: I0130 08:15:01.182985 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" event={"ID":"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409","Type":"ContainerStarted","Data":"8d0d384940320938f76b4606a7945e7e85fc7299d14ce535b306384ff5a56415"} Jan 30 08:15:02 crc kubenswrapper[4870]: I0130 08:15:02.192625 4870 generic.go:334] "Generic (PLEG): container finished" podID="84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" containerID="906fa4603bfe71976f941c25c726c6a5f3b1b9c0bede621580c2910f359fd6f2" exitCode=0 Jan 30 08:15:02 crc kubenswrapper[4870]: I0130 08:15:02.192707 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" event={"ID":"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409","Type":"ContainerDied","Data":"906fa4603bfe71976f941c25c726c6a5f3b1b9c0bede621580c2910f359fd6f2"} Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.508612 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.550637 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d2tx\" (UniqueName: \"kubernetes.io/projected/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-kube-api-access-7d2tx\") pod \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.550735 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-secret-volume\") pod \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.550849 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-config-volume\") pod \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\" (UID: \"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409\") " Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.551966 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-config-volume" (OuterVolumeSpecName: "config-volume") pod "84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" (UID: "84ecc6b5-f0f3-40b1-ba86-24eabdbdc409"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.560104 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" (UID: "84ecc6b5-f0f3-40b1-ba86-24eabdbdc409"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.560129 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-kube-api-access-7d2tx" (OuterVolumeSpecName: "kube-api-access-7d2tx") pod "84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" (UID: "84ecc6b5-f0f3-40b1-ba86-24eabdbdc409"). InnerVolumeSpecName "kube-api-access-7d2tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.652997 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.653058 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d2tx\" (UniqueName: \"kubernetes.io/projected/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-kube-api-access-7d2tx\") on node \"crc\" DevicePath \"\"" Jan 30 08:15:03 crc kubenswrapper[4870]: I0130 08:15:03.653078 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.215964 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" event={"ID":"84ecc6b5-f0f3-40b1-ba86-24eabdbdc409","Type":"ContainerDied","Data":"8d0d384940320938f76b4606a7945e7e85fc7299d14ce535b306384ff5a56415"} Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.216400 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d0d384940320938f76b4606a7945e7e85fc7299d14ce535b306384ff5a56415" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.216066 4870 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.496813 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-glxrr"] Jan 30 08:15:04 crc kubenswrapper[4870]: E0130 08:15:04.497171 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" containerName="collect-profiles" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.497194 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" containerName="collect-profiles" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.497426 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" containerName="collect-profiles" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.498768 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.502932 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.567582 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74668\" (UniqueName: \"kubernetes.io/projected/b1839882-74e1-4c94-9d83-849d10c41089-kube-api-access-74668\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.567692 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1839882-74e1-4c94-9d83-849d10c41089-catalog-content\") pod \"redhat-marketplace-glxrr\" 
(UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.568004 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1839882-74e1-4c94-9d83-849d10c41089-utilities\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.570766 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-glxrr"] Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.668984 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1839882-74e1-4c94-9d83-849d10c41089-catalog-content\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.669059 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1839882-74e1-4c94-9d83-849d10c41089-utilities\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.669137 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74668\" (UniqueName: \"kubernetes.io/projected/b1839882-74e1-4c94-9d83-849d10c41089-kube-api-access-74668\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.669944 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1839882-74e1-4c94-9d83-849d10c41089-utilities\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.670000 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1839882-74e1-4c94-9d83-849d10c41089-catalog-content\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.681976 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8dmqx"] Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.683310 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.685915 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.694788 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74668\" (UniqueName: \"kubernetes.io/projected/b1839882-74e1-4c94-9d83-849d10c41089-kube-api-access-74668\") pod \"redhat-marketplace-glxrr\" (UID: \"b1839882-74e1-4c94-9d83-849d10c41089\") " pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.703389 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dmqx"] Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.770034 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvqn\" (UniqueName: 
\"kubernetes.io/projected/71b77216-d7c7-4a69-8596-e64fd99129c6-kube-api-access-4rvqn\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.770078 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b77216-d7c7-4a69-8596-e64fd99129c6-catalog-content\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.770131 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b77216-d7c7-4a69-8596-e64fd99129c6-utilities\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.825262 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.871547 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b77216-d7c7-4a69-8596-e64fd99129c6-utilities\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.871710 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvqn\" (UniqueName: \"kubernetes.io/projected/71b77216-d7c7-4a69-8596-e64fd99129c6-kube-api-access-4rvqn\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.871761 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b77216-d7c7-4a69-8596-e64fd99129c6-catalog-content\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.872382 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b77216-d7c7-4a69-8596-e64fd99129c6-utilities\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.872716 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b77216-d7c7-4a69-8596-e64fd99129c6-catalog-content\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " 
pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:04 crc kubenswrapper[4870]: I0130 08:15:04.899220 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvqn\" (UniqueName: \"kubernetes.io/projected/71b77216-d7c7-4a69-8596-e64fd99129c6-kube-api-access-4rvqn\") pod \"redhat-operators-8dmqx\" (UID: \"71b77216-d7c7-4a69-8596-e64fd99129c6\") " pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:05 crc kubenswrapper[4870]: I0130 08:15:05.039711 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:05 crc kubenswrapper[4870]: I0130 08:15:05.116274 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-glxrr"] Jan 30 08:15:05 crc kubenswrapper[4870]: W0130 08:15:05.127194 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1839882_74e1_4c94_9d83_849d10c41089.slice/crio-5cfbdd285fb5a1b266382a405cf50ff0fb9a35f550f2ebe363c8d626d46c03f9 WatchSource:0}: Error finding container 5cfbdd285fb5a1b266382a405cf50ff0fb9a35f550f2ebe363c8d626d46c03f9: Status 404 returned error can't find the container with id 5cfbdd285fb5a1b266382a405cf50ff0fb9a35f550f2ebe363c8d626d46c03f9 Jan 30 08:15:05 crc kubenswrapper[4870]: I0130 08:15:05.229015 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glxrr" event={"ID":"b1839882-74e1-4c94-9d83-849d10c41089","Type":"ContainerStarted","Data":"5cfbdd285fb5a1b266382a405cf50ff0fb9a35f550f2ebe363c8d626d46c03f9"} Jan 30 08:15:05 crc kubenswrapper[4870]: I0130 08:15:05.448470 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dmqx"] Jan 30 08:15:05 crc kubenswrapper[4870]: W0130 08:15:05.450472 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b77216_d7c7_4a69_8596_e64fd99129c6.slice/crio-76ce4bf2294f97bc51fdcf08978665956a8ea8777d7c21af9eb40ef33d40e3f3 WatchSource:0}: Error finding container 76ce4bf2294f97bc51fdcf08978665956a8ea8777d7c21af9eb40ef33d40e3f3: Status 404 returned error can't find the container with id 76ce4bf2294f97bc51fdcf08978665956a8ea8777d7c21af9eb40ef33d40e3f3 Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.250791 4870 generic.go:334] "Generic (PLEG): container finished" podID="b1839882-74e1-4c94-9d83-849d10c41089" containerID="0fb6f743286c6bf6e84f87f46aac4248d0b275d919c7a9ed98a9102f025aaeae" exitCode=0 Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.251164 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glxrr" event={"ID":"b1839882-74e1-4c94-9d83-849d10c41089","Type":"ContainerDied","Data":"0fb6f743286c6bf6e84f87f46aac4248d0b275d919c7a9ed98a9102f025aaeae"} Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.257777 4870 generic.go:334] "Generic (PLEG): container finished" podID="71b77216-d7c7-4a69-8596-e64fd99129c6" containerID="f7558377abf46275ac5cf8b97589797380793e1e66c29e69aa698b670a7ac33c" exitCode=0 Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.257898 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dmqx" event={"ID":"71b77216-d7c7-4a69-8596-e64fd99129c6","Type":"ContainerDied","Data":"f7558377abf46275ac5cf8b97589797380793e1e66c29e69aa698b670a7ac33c"} Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.257958 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dmqx" event={"ID":"71b77216-d7c7-4a69-8596-e64fd99129c6","Type":"ContainerStarted","Data":"76ce4bf2294f97bc51fdcf08978665956a8ea8777d7c21af9eb40ef33d40e3f3"} Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.885317 4870 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-whfhw"] Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.886490 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.893518 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 08:15:06 crc kubenswrapper[4870]: I0130 08:15:06.902818 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whfhw"] Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.009799 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea80eb92-6881-4e69-8ca2-050d32254eb7-utilities\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.009973 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea80eb92-6881-4e69-8ca2-050d32254eb7-catalog-content\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.010079 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbkb\" (UniqueName: \"kubernetes.io/projected/ea80eb92-6881-4e69-8ca2-050d32254eb7-kube-api-access-gzbkb\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.081043 4870 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-mqxgq"] Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.082175 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.084663 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.090314 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mqxgq"] Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.111792 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbkb\" (UniqueName: \"kubernetes.io/projected/ea80eb92-6881-4e69-8ca2-050d32254eb7-kube-api-access-gzbkb\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.111865 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea80eb92-6881-4e69-8ca2-050d32254eb7-utilities\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.111926 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea80eb92-6881-4e69-8ca2-050d32254eb7-catalog-content\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.112447 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ea80eb92-6881-4e69-8ca2-050d32254eb7-catalog-content\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.112921 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea80eb92-6881-4e69-8ca2-050d32254eb7-utilities\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.133253 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbkb\" (UniqueName: \"kubernetes.io/projected/ea80eb92-6881-4e69-8ca2-050d32254eb7-kube-api-access-gzbkb\") pod \"certified-operators-whfhw\" (UID: \"ea80eb92-6881-4e69-8ca2-050d32254eb7\") " pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.213407 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b3d065-5057-49c1-be84-7880d7d4d619-utilities\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.213473 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqtwk\" (UniqueName: \"kubernetes.io/projected/d7b3d065-5057-49c1-be84-7880d7d4d619-kube-api-access-bqtwk\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.213495 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b3d065-5057-49c1-be84-7880d7d4d619-catalog-content\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.215648 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.272087 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dmqx" event={"ID":"71b77216-d7c7-4a69-8596-e64fd99129c6","Type":"ContainerStarted","Data":"16f0185709325c50e5b32b57ecacced21f287c87b7c1519e6c80bab8c73d585a"} Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.320010 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b3d065-5057-49c1-be84-7880d7d4d619-utilities\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.320098 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqtwk\" (UniqueName: \"kubernetes.io/projected/d7b3d065-5057-49c1-be84-7880d7d4d619-kube-api-access-bqtwk\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.320121 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b3d065-5057-49c1-be84-7880d7d4d619-catalog-content\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: 
I0130 08:15:07.320681 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b3d065-5057-49c1-be84-7880d7d4d619-catalog-content\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.320983 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b3d065-5057-49c1-be84-7880d7d4d619-utilities\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.343449 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqtwk\" (UniqueName: \"kubernetes.io/projected/d7b3d065-5057-49c1-be84-7880d7d4d619-kube-api-access-bqtwk\") pod \"community-operators-mqxgq\" (UID: \"d7b3d065-5057-49c1-be84-7880d7d4d619\") " pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.405394 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.630683 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whfhw"] Jan 30 08:15:07 crc kubenswrapper[4870]: I0130 08:15:07.788987 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mqxgq"] Jan 30 08:15:07 crc kubenswrapper[4870]: W0130 08:15:07.854337 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b3d065_5057_49c1_be84_7880d7d4d619.slice/crio-9bc7725a1aac8f3ce2d5c5ec8dbf24934b3bfdd442849e0ac029db29662fc649 WatchSource:0}: Error finding container 9bc7725a1aac8f3ce2d5c5ec8dbf24934b3bfdd442849e0ac029db29662fc649: Status 404 returned error can't find the container with id 9bc7725a1aac8f3ce2d5c5ec8dbf24934b3bfdd442849e0ac029db29662fc649 Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.287286 4870 generic.go:334] "Generic (PLEG): container finished" podID="71b77216-d7c7-4a69-8596-e64fd99129c6" containerID="16f0185709325c50e5b32b57ecacced21f287c87b7c1519e6c80bab8c73d585a" exitCode=0 Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.287362 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dmqx" event={"ID":"71b77216-d7c7-4a69-8596-e64fd99129c6","Type":"ContainerDied","Data":"16f0185709325c50e5b32b57ecacced21f287c87b7c1519e6c80bab8c73d585a"} Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.289658 4870 generic.go:334] "Generic (PLEG): container finished" podID="d7b3d065-5057-49c1-be84-7880d7d4d619" containerID="53ccff46255318b58b07d04b23d8f25bcd2e7063e9cb3336eb3d8abf6464ba57" exitCode=0 Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.289743 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqxgq" 
event={"ID":"d7b3d065-5057-49c1-be84-7880d7d4d619","Type":"ContainerDied","Data":"53ccff46255318b58b07d04b23d8f25bcd2e7063e9cb3336eb3d8abf6464ba57"} Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.289781 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqxgq" event={"ID":"d7b3d065-5057-49c1-be84-7880d7d4d619","Type":"ContainerStarted","Data":"9bc7725a1aac8f3ce2d5c5ec8dbf24934b3bfdd442849e0ac029db29662fc649"} Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.293623 4870 generic.go:334] "Generic (PLEG): container finished" podID="ea80eb92-6881-4e69-8ca2-050d32254eb7" containerID="4f4db8280ca4d0135958943e8472d2d7a5a94788e4391049ca7cb7386c1ecee3" exitCode=0 Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.293862 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfhw" event={"ID":"ea80eb92-6881-4e69-8ca2-050d32254eb7","Type":"ContainerDied","Data":"4f4db8280ca4d0135958943e8472d2d7a5a94788e4391049ca7cb7386c1ecee3"} Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.293949 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfhw" event={"ID":"ea80eb92-6881-4e69-8ca2-050d32254eb7","Type":"ContainerStarted","Data":"cf62dd867dfdc85f00ce0a625b67208af5a508841846bf5274a9dc74b568f567"} Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.301744 4870 generic.go:334] "Generic (PLEG): container finished" podID="b1839882-74e1-4c94-9d83-849d10c41089" containerID="cc14485a338ac264a7ea890f0b0accaa3f3a33b6c65407cec7c7b0303baf5081" exitCode=0 Jan 30 08:15:08 crc kubenswrapper[4870]: I0130 08:15:08.301827 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glxrr" event={"ID":"b1839882-74e1-4c94-9d83-849d10c41089","Type":"ContainerDied","Data":"cc14485a338ac264a7ea890f0b0accaa3f3a33b6c65407cec7c7b0303baf5081"} Jan 30 08:15:09 crc kubenswrapper[4870]: I0130 
08:15:09.311429 4870 generic.go:334] "Generic (PLEG): container finished" podID="d7b3d065-5057-49c1-be84-7880d7d4d619" containerID="53bbdcc8bd93ea4af16456df8b4618db541529c682064431c80b1ace4a00e00a" exitCode=0 Jan 30 08:15:09 crc kubenswrapper[4870]: I0130 08:15:09.311589 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqxgq" event={"ID":"d7b3d065-5057-49c1-be84-7880d7d4d619","Type":"ContainerDied","Data":"53bbdcc8bd93ea4af16456df8b4618db541529c682064431c80b1ace4a00e00a"} Jan 30 08:15:09 crc kubenswrapper[4870]: I0130 08:15:09.317190 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfhw" event={"ID":"ea80eb92-6881-4e69-8ca2-050d32254eb7","Type":"ContainerStarted","Data":"2b0c1f13038b197a88d12acd9fb107eaedab3767def340d7b9af7dcc855bef2a"} Jan 30 08:15:09 crc kubenswrapper[4870]: I0130 08:15:09.320487 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-glxrr" event={"ID":"b1839882-74e1-4c94-9d83-849d10c41089","Type":"ContainerStarted","Data":"c9149def346396f297e19fc2caa4ef54779bdeb0935c4d325d7957e05e13cbc7"} Jan 30 08:15:09 crc kubenswrapper[4870]: I0130 08:15:09.325192 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dmqx" event={"ID":"71b77216-d7c7-4a69-8596-e64fd99129c6","Type":"ContainerStarted","Data":"64afd883eaf0caf5531d3c234b5983ac6112855435b1120c5aa207280d615f87"} Jan 30 08:15:09 crc kubenswrapper[4870]: I0130 08:15:09.362708 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8dmqx" podStartSLOduration=2.935283677 podStartE2EDuration="5.362682577s" podCreationTimestamp="2026-01-30 08:15:04 +0000 UTC" firstStartedPulling="2026-01-30 08:15:06.259862597 +0000 UTC m=+344.955409746" lastFinishedPulling="2026-01-30 08:15:08.687261527 +0000 UTC m=+347.382808646" observedRunningTime="2026-01-30 
08:15:09.356078861 +0000 UTC m=+348.051625970" watchObservedRunningTime="2026-01-30 08:15:09.362682577 +0000 UTC m=+348.058229676" Jan 30 08:15:09 crc kubenswrapper[4870]: I0130 08:15:09.395637 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-glxrr" podStartSLOduration=2.725026579 podStartE2EDuration="5.395594767s" podCreationTimestamp="2026-01-30 08:15:04 +0000 UTC" firstStartedPulling="2026-01-30 08:15:06.255698486 +0000 UTC m=+344.951245625" lastFinishedPulling="2026-01-30 08:15:08.926266704 +0000 UTC m=+347.621813813" observedRunningTime="2026-01-30 08:15:09.392726288 +0000 UTC m=+348.088273397" watchObservedRunningTime="2026-01-30 08:15:09.395594767 +0000 UTC m=+348.091141916" Jan 30 08:15:10 crc kubenswrapper[4870]: I0130 08:15:10.333540 4870 generic.go:334] "Generic (PLEG): container finished" podID="ea80eb92-6881-4e69-8ca2-050d32254eb7" containerID="2b0c1f13038b197a88d12acd9fb107eaedab3767def340d7b9af7dcc855bef2a" exitCode=0 Jan 30 08:15:10 crc kubenswrapper[4870]: I0130 08:15:10.333581 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfhw" event={"ID":"ea80eb92-6881-4e69-8ca2-050d32254eb7","Type":"ContainerDied","Data":"2b0c1f13038b197a88d12acd9fb107eaedab3767def340d7b9af7dcc855bef2a"} Jan 30 08:15:10 crc kubenswrapper[4870]: I0130 08:15:10.338191 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mqxgq" event={"ID":"d7b3d065-5057-49c1-be84-7880d7d4d619","Type":"ContainerStarted","Data":"1acbed91c3bf635499b07298f9a6aadf605b78600fddd206109ebc4db66b0d62"} Jan 30 08:15:10 crc kubenswrapper[4870]: I0130 08:15:10.383211 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mqxgq" podStartSLOduration=1.683953834 podStartE2EDuration="3.383188274s" podCreationTimestamp="2026-01-30 08:15:07 +0000 UTC" 
firstStartedPulling="2026-01-30 08:15:08.292133046 +0000 UTC m=+346.987680165" lastFinishedPulling="2026-01-30 08:15:09.991367476 +0000 UTC m=+348.686914605" observedRunningTime="2026-01-30 08:15:10.380328034 +0000 UTC m=+349.075875143" watchObservedRunningTime="2026-01-30 08:15:10.383188274 +0000 UTC m=+349.078735403" Jan 30 08:15:11 crc kubenswrapper[4870]: I0130 08:15:11.347591 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whfhw" event={"ID":"ea80eb92-6881-4e69-8ca2-050d32254eb7","Type":"ContainerStarted","Data":"7a73f19a2365c56a7085d2611b8ec6d2b0bc0e74fac6b45cb792e143380218f0"} Jan 30 08:15:11 crc kubenswrapper[4870]: I0130 08:15:11.372150 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whfhw" podStartSLOduration=2.934882493 podStartE2EDuration="5.372122202s" podCreationTimestamp="2026-01-30 08:15:06 +0000 UTC" firstStartedPulling="2026-01-30 08:15:08.295990736 +0000 UTC m=+346.991537865" lastFinishedPulling="2026-01-30 08:15:10.733230475 +0000 UTC m=+349.428777574" observedRunningTime="2026-01-30 08:15:11.366634521 +0000 UTC m=+350.062181640" watchObservedRunningTime="2026-01-30 08:15:11.372122202 +0000 UTC m=+350.067669331" Jan 30 08:15:14 crc kubenswrapper[4870]: I0130 08:15:14.825785 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:14 crc kubenswrapper[4870]: I0130 08:15:14.826627 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:14 crc kubenswrapper[4870]: I0130 08:15:14.884265 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:15 crc kubenswrapper[4870]: I0130 08:15:15.040627 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:15 crc kubenswrapper[4870]: I0130 08:15:15.040744 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:15 crc kubenswrapper[4870]: I0130 08:15:15.442275 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-glxrr" Jan 30 08:15:16 crc kubenswrapper[4870]: I0130 08:15:16.114349 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8dmqx" podUID="71b77216-d7c7-4a69-8596-e64fd99129c6" containerName="registry-server" probeResult="failure" output=< Jan 30 08:15:16 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:15:16 crc kubenswrapper[4870]: > Jan 30 08:15:17 crc kubenswrapper[4870]: I0130 08:15:17.216130 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:17 crc kubenswrapper[4870]: I0130 08:15:17.216253 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:17 crc kubenswrapper[4870]: I0130 08:15:17.276623 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:17 crc kubenswrapper[4870]: I0130 08:15:17.406252 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:17 crc kubenswrapper[4870]: I0130 08:15:17.406328 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:17 crc kubenswrapper[4870]: I0130 08:15:17.436467 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whfhw" Jan 30 08:15:17 crc 
kubenswrapper[4870]: I0130 08:15:17.454969 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:18 crc kubenswrapper[4870]: I0130 08:15:18.449017 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mqxgq" Jan 30 08:15:25 crc kubenswrapper[4870]: I0130 08:15:25.109991 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:25 crc kubenswrapper[4870]: I0130 08:15:25.182378 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8dmqx" Jan 30 08:15:25 crc kubenswrapper[4870]: I0130 08:15:25.249658 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:15:25 crc kubenswrapper[4870]: I0130 08:15:25.249757 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.842760 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vhbz2"] Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.844993 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.864737 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vhbz2"] Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983087 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f3a2e4b-def0-466b-8e43-383345474a2d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983150 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f3a2e4b-def0-466b-8e43-383345474a2d-registry-certificates\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983176 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8m56\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-kube-api-access-n8m56\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983203 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983513 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-registry-tls\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983603 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f3a2e4b-def0-466b-8e43-383345474a2d-trusted-ca\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983689 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f3a2e4b-def0-466b-8e43-383345474a2d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:40 crc kubenswrapper[4870]: I0130 08:15:40.983912 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-bound-sa-token\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.025056 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.085296 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f3a2e4b-def0-466b-8e43-383345474a2d-registry-certificates\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.085358 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8m56\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-kube-api-access-n8m56\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.085419 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-registry-tls\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.085444 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f3a2e4b-def0-466b-8e43-383345474a2d-trusted-ca\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.085471 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f3a2e4b-def0-466b-8e43-383345474a2d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.085499 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-bound-sa-token\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.085526 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f3a2e4b-def0-466b-8e43-383345474a2d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.086040 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f3a2e4b-def0-466b-8e43-383345474a2d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.086742 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f3a2e4b-def0-466b-8e43-383345474a2d-registry-certificates\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 
08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.088021 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f3a2e4b-def0-466b-8e43-383345474a2d-trusted-ca\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.093160 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f3a2e4b-def0-466b-8e43-383345474a2d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.093327 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-registry-tls\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.104329 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-bound-sa-token\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.117489 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8m56\" (UniqueName: \"kubernetes.io/projected/5f3a2e4b-def0-466b-8e43-383345474a2d-kube-api-access-n8m56\") pod \"image-registry-66df7c8f76-vhbz2\" (UID: \"5f3a2e4b-def0-466b-8e43-383345474a2d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.165548 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.409038 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vhbz2"] Jan 30 08:15:41 crc kubenswrapper[4870]: I0130 08:15:41.542557 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" event={"ID":"5f3a2e4b-def0-466b-8e43-383345474a2d","Type":"ContainerStarted","Data":"a90d31a814b86f627a43e25379ef68624e267caa6fd5a7c13fe2c5eeb94ccddd"} Jan 30 08:15:42 crc kubenswrapper[4870]: I0130 08:15:42.551342 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" event={"ID":"5f3a2e4b-def0-466b-8e43-383345474a2d","Type":"ContainerStarted","Data":"16d064f1b3edb8b3bde9ec43d0bc02c68594fa70738a5c1838c6ae297e3c59a9"} Jan 30 08:15:42 crc kubenswrapper[4870]: I0130 08:15:42.551939 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:15:42 crc kubenswrapper[4870]: I0130 08:15:42.582786 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" podStartSLOduration=2.58275959 podStartE2EDuration="2.58275959s" podCreationTimestamp="2026-01-30 08:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:15:42.579846729 +0000 UTC m=+381.275393848" watchObservedRunningTime="2026-01-30 08:15:42.58275959 +0000 UTC m=+381.278306709" Jan 30 08:15:55 crc kubenswrapper[4870]: I0130 08:15:55.250243 4870 patch_prober.go:28] interesting 
pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:15:55 crc kubenswrapper[4870]: I0130 08:15:55.251185 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:16:01 crc kubenswrapper[4870]: I0130 08:16:01.174987 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vhbz2" Jan 30 08:16:01 crc kubenswrapper[4870]: I0130 08:16:01.283137 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sfs65"] Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.249561 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.250433 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.250558 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:16:25 
crc kubenswrapper[4870]: I0130 08:16:25.252030 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a02ea1737ed88e21ae8883a3a6a22392b0695152f4ace29771521a0445381b12"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.252186 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://a02ea1737ed88e21ae8883a3a6a22392b0695152f4ace29771521a0445381b12" gracePeriod=600 Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.832090 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="a02ea1737ed88e21ae8883a3a6a22392b0695152f4ace29771521a0445381b12" exitCode=0 Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.832152 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"a02ea1737ed88e21ae8883a3a6a22392b0695152f4ace29771521a0445381b12"} Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.832643 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"f6c12e6d68c222de0711d262165234642f23c035c994270d4c786852d266f7a2"} Jan 30 08:16:25 crc kubenswrapper[4870]: I0130 08:16:25.832678 4870 scope.go:117] "RemoveContainer" containerID="94b47e3ea4a8a9e203f1255aaef680cd59b881d40b2d681901860cf3608c5cc7" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.336143 4870 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" podUID="406fb8be-c783-4ef8-8aae-5430b0226d17" containerName="registry" containerID="cri-o://5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79" gracePeriod=30 Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.716417 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.794428 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/406fb8be-c783-4ef8-8aae-5430b0226d17-installation-pull-secrets\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.794476 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-certificates\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.794551 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-bound-sa-token\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.794578 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-trusted-ca\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: 
I0130 08:16:26.794601 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/406fb8be-c783-4ef8-8aae-5430b0226d17-ca-trust-extracted\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.794635 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jldpq\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-kube-api-access-jldpq\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.794665 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-tls\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.796200 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.796315 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"406fb8be-c783-4ef8-8aae-5430b0226d17\" (UID: \"406fb8be-c783-4ef8-8aae-5430b0226d17\") " Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.796346 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.796553 4870 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.796570 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/406fb8be-c783-4ef8-8aae-5430b0226d17-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.803267 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.803458 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406fb8be-c783-4ef8-8aae-5430b0226d17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.804315 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-kube-api-access-jldpq" (OuterVolumeSpecName: "kube-api-access-jldpq") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "kube-api-access-jldpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.806257 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.808515 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.817303 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/406fb8be-c783-4ef8-8aae-5430b0226d17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "406fb8be-c783-4ef8-8aae-5430b0226d17" (UID: "406fb8be-c783-4ef8-8aae-5430b0226d17"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.850653 4870 generic.go:334] "Generic (PLEG): container finished" podID="406fb8be-c783-4ef8-8aae-5430b0226d17" containerID="5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79" exitCode=0 Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.850699 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" event={"ID":"406fb8be-c783-4ef8-8aae-5430b0226d17","Type":"ContainerDied","Data":"5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79"} Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.850727 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" event={"ID":"406fb8be-c783-4ef8-8aae-5430b0226d17","Type":"ContainerDied","Data":"47d84e04f9b3f93637b83fdd855c471e56293ba330cba3caf1369ea3f8340bb4"} Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.850746 4870 scope.go:117] "RemoveContainer" containerID="5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.850751 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sfs65" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.868412 4870 scope.go:117] "RemoveContainer" containerID="5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79" Jan 30 08:16:26 crc kubenswrapper[4870]: E0130 08:16:26.868834 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79\": container with ID starting with 5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79 not found: ID does not exist" containerID="5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.868866 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79"} err="failed to get container status \"5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79\": rpc error: code = NotFound desc = could not find container \"5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79\": container with ID starting with 5552b3f0c363ddea2c6329d4466cd116049d2d523ebab1e5ac1b44fd378bdb79 not found: ID does not exist" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.887660 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sfs65"] Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.892888 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sfs65"] Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.897850 4870 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 08:16:26 crc kubenswrapper[4870]: 
I0130 08:16:26.897903 4870 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/406fb8be-c783-4ef8-8aae-5430b0226d17-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.897914 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jldpq\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-kube-api-access-jldpq\") on node \"crc\" DevicePath \"\"" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.897927 4870 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/406fb8be-c783-4ef8-8aae-5430b0226d17-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:16:26 crc kubenswrapper[4870]: I0130 08:16:26.897940 4870 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/406fb8be-c783-4ef8-8aae-5430b0226d17-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 08:16:26 crc kubenswrapper[4870]: E0130 08:16:26.971806 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406fb8be_c783_4ef8_8aae_5430b0226d17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406fb8be_c783_4ef8_8aae_5430b0226d17.slice/crio-47d84e04f9b3f93637b83fdd855c471e56293ba330cba3caf1369ea3f8340bb4\": RecentStats: unable to find data in memory cache]" Jan 30 08:16:28 crc kubenswrapper[4870]: I0130 08:16:28.091173 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406fb8be-c783-4ef8-8aae-5430b0226d17" path="/var/lib/kubelet/pods/406fb8be-c783-4ef8-8aae-5430b0226d17/volumes" Jan 30 08:18:25 crc kubenswrapper[4870]: I0130 08:18:25.249630 4870 patch_prober.go:28] interesting 
pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:18:25 crc kubenswrapper[4870]: I0130 08:18:25.250912 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:18:55 crc kubenswrapper[4870]: I0130 08:18:55.249525 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:18:55 crc kubenswrapper[4870]: I0130 08:18:55.250224 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.002766 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-ltz5g"] Jan 30 08:19:18 crc kubenswrapper[4870]: E0130 08:19:18.004258 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406fb8be-c783-4ef8-8aae-5430b0226d17" containerName="registry" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.004283 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="406fb8be-c783-4ef8-8aae-5430b0226d17" containerName="registry" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 
08:19:18.004454 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="406fb8be-c783-4ef8-8aae-5430b0226d17" containerName="registry" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.005097 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl"] Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.005611 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ltz5g" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.005772 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.008558 4870 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-jvp7h" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.009045 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.009362 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.009493 4870 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-nsgpw" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.019056 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl"] Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.022784 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-n5xzk"] Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.023807 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.025420 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ltz5g"] Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.028053 4870 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6dtkb" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.057726 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-n5xzk"] Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.167591 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6sq4\" (UniqueName: \"kubernetes.io/projected/dfee5a53-cd5a-470f-9327-e614ff6e56b3-kube-api-access-l6sq4\") pod \"cert-manager-858654f9db-ltz5g\" (UID: \"dfee5a53-cd5a-470f-9327-e614ff6e56b3\") " pod="cert-manager/cert-manager-858654f9db-ltz5g" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.167693 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5z8\" (UniqueName: \"kubernetes.io/projected/4e91c0f0-40df-495c-8758-892355565838-kube-api-access-ck5z8\") pod \"cert-manager-cainjector-cf98fcc89-2hzbl\" (UID: \"4e91c0f0-40df-495c-8758-892355565838\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.167934 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mkr\" (UniqueName: \"kubernetes.io/projected/c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1-kube-api-access-d4mkr\") pod \"cert-manager-webhook-687f57d79b-n5xzk\" (UID: \"c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.269859 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mkr\" (UniqueName: \"kubernetes.io/projected/c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1-kube-api-access-d4mkr\") pod \"cert-manager-webhook-687f57d79b-n5xzk\" (UID: \"c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.269945 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6sq4\" (UniqueName: \"kubernetes.io/projected/dfee5a53-cd5a-470f-9327-e614ff6e56b3-kube-api-access-l6sq4\") pod \"cert-manager-858654f9db-ltz5g\" (UID: \"dfee5a53-cd5a-470f-9327-e614ff6e56b3\") " pod="cert-manager/cert-manager-858654f9db-ltz5g" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.269978 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck5z8\" (UniqueName: \"kubernetes.io/projected/4e91c0f0-40df-495c-8758-892355565838-kube-api-access-ck5z8\") pod \"cert-manager-cainjector-cf98fcc89-2hzbl\" (UID: \"4e91c0f0-40df-495c-8758-892355565838\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.293696 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6sq4\" (UniqueName: \"kubernetes.io/projected/dfee5a53-cd5a-470f-9327-e614ff6e56b3-kube-api-access-l6sq4\") pod \"cert-manager-858654f9db-ltz5g\" (UID: \"dfee5a53-cd5a-470f-9327-e614ff6e56b3\") " pod="cert-manager/cert-manager-858654f9db-ltz5g" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.294076 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mkr\" (UniqueName: \"kubernetes.io/projected/c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1-kube-api-access-d4mkr\") pod \"cert-manager-webhook-687f57d79b-n5xzk\" (UID: \"c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.295546 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5z8\" (UniqueName: \"kubernetes.io/projected/4e91c0f0-40df-495c-8758-892355565838-kube-api-access-ck5z8\") pod \"cert-manager-cainjector-cf98fcc89-2hzbl\" (UID: \"4e91c0f0-40df-495c-8758-892355565838\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.337104 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ltz5g" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.345864 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.357848 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.622834 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl"] Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.646893 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.682077 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ltz5g"] Jan 30 08:19:18 crc kubenswrapper[4870]: I0130 08:19:18.728054 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-n5xzk"] Jan 30 08:19:19 crc kubenswrapper[4870]: I0130 08:19:19.114002 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" 
event={"ID":"c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1","Type":"ContainerStarted","Data":"fe888275e7e4c12155b17ef62baef34fceb7a734978a3d054f94765f44fdead3"} Jan 30 08:19:19 crc kubenswrapper[4870]: I0130 08:19:19.116315 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" event={"ID":"4e91c0f0-40df-495c-8758-892355565838","Type":"ContainerStarted","Data":"4224cf87c60b6e8d522910b1a823312155a71a4ec56669d68ff585c5e3020415"} Jan 30 08:19:19 crc kubenswrapper[4870]: I0130 08:19:19.119934 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ltz5g" event={"ID":"dfee5a53-cd5a-470f-9327-e614ff6e56b3","Type":"ContainerStarted","Data":"d85eb2a5e5b6087011864a97717e70327f7890274e9a6b4d024911d6b14fd2a2"} Jan 30 08:19:25 crc kubenswrapper[4870]: I0130 08:19:25.249567 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:19:25 crc kubenswrapper[4870]: I0130 08:19:25.250298 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:19:25 crc kubenswrapper[4870]: I0130 08:19:25.250369 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:19:25 crc kubenswrapper[4870]: I0130 08:19:25.251160 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f6c12e6d68c222de0711d262165234642f23c035c994270d4c786852d266f7a2"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:19:25 crc kubenswrapper[4870]: I0130 08:19:25.251250 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://f6c12e6d68c222de0711d262165234642f23c035c994270d4c786852d266f7a2" gracePeriod=600 Jan 30 08:19:26 crc kubenswrapper[4870]: I0130 08:19:26.182464 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" event={"ID":"4e91c0f0-40df-495c-8758-892355565838","Type":"ContainerStarted","Data":"b30160def04b3532801e1abfaad294e0d342a8fb215be5963b2b08f7a4506818"} Jan 30 08:19:26 crc kubenswrapper[4870]: I0130 08:19:26.188536 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="f6c12e6d68c222de0711d262165234642f23c035c994270d4c786852d266f7a2" exitCode=0 Jan 30 08:19:26 crc kubenswrapper[4870]: I0130 08:19:26.188614 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"f6c12e6d68c222de0711d262165234642f23c035c994270d4c786852d266f7a2"} Jan 30 08:19:26 crc kubenswrapper[4870]: I0130 08:19:26.188669 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"8f05305445b605660ea999aab22b621a1da0c30929b1af3251f46624decd30be"} Jan 30 08:19:26 crc kubenswrapper[4870]: I0130 08:19:26.188704 4870 scope.go:117] 
"RemoveContainer" containerID="a02ea1737ed88e21ae8883a3a6a22392b0695152f4ace29771521a0445381b12" Jan 30 08:19:26 crc kubenswrapper[4870]: I0130 08:19:26.239446 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-2hzbl" podStartSLOduration=2.576752582 podStartE2EDuration="9.2394237s" podCreationTimestamp="2026-01-30 08:19:17 +0000 UTC" firstStartedPulling="2026-01-30 08:19:18.646647213 +0000 UTC m=+597.342194322" lastFinishedPulling="2026-01-30 08:19:25.309318331 +0000 UTC m=+604.004865440" observedRunningTime="2026-01-30 08:19:26.204098849 +0000 UTC m=+604.899645978" watchObservedRunningTime="2026-01-30 08:19:26.2394237 +0000 UTC m=+604.934970809" Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.678063 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cj5db"] Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.679086 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-controller" containerID="cri-o://575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7" gracePeriod=30 Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.679157 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="sbdb" containerID="cri-o://8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959" gracePeriod=30 Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.679195 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="northd" containerID="cri-o://a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c" gracePeriod=30 Jan 30 08:19:27 
crc kubenswrapper[4870]: I0130 08:19:27.679304 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499" gracePeriod=30 Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.679293 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-node" containerID="cri-o://c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad" gracePeriod=30 Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.679350 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-acl-logging" containerID="cri-o://0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71" gracePeriod=30 Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.679294 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="nbdb" containerID="cri-o://0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165" gracePeriod=30 Jan 30 08:19:27 crc kubenswrapper[4870]: I0130 08:19:27.756753 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" containerID="cri-o://b741960d899fead07c73e8ea4b750a10bd019b223fe9d09e7a67a573f3e4bee3" gracePeriod=30 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.202040 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovnkube-controller/3.log" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.204971 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovn-acl-logging/0.log" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.205661 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovn-controller/0.log" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206056 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="b741960d899fead07c73e8ea4b750a10bd019b223fe9d09e7a67a573f3e4bee3" exitCode=0 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206084 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959" exitCode=0 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206093 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165" exitCode=0 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206102 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c" exitCode=0 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206112 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499" exitCode=0 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206121 4870 generic.go:334] "Generic (PLEG): container finished" 
podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad" exitCode=0 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206129 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71" exitCode=143 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206137 4870 generic.go:334] "Generic (PLEG): container finished" podID="36037609-52f9-4c09-8beb-6d35a039347b" containerID="575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7" exitCode=143 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206204 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"b741960d899fead07c73e8ea4b750a10bd019b223fe9d09e7a67a573f3e4bee3"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206283 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206314 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206332 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206354 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206378 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206397 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206403 4870 scope.go:117] "RemoveContainer" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.206415 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.208234 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/2.log" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.208689 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/1.log" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.208728 4870 generic.go:334] "Generic (PLEG): container finished" podID="3e8e9e25-2b9b-4820-8282-48e1d930a721" 
containerID="61538cdbec39ead4232db7d69f7b41605b3dfdb222395b4c93251e6ded8b3e41" exitCode=2 Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.208781 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerDied","Data":"61538cdbec39ead4232db7d69f7b41605b3dfdb222395b4c93251e6ded8b3e41"} Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.209380 4870 scope.go:117] "RemoveContainer" containerID="61538cdbec39ead4232db7d69f7b41605b3dfdb222395b4c93251e6ded8b3e41" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.209567 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hsmrb_openshift-multus(3e8e9e25-2b9b-4820-8282-48e1d930a721)\"" pod="openshift-multus/multus-hsmrb" podUID="3e8e9e25-2b9b-4820-8282-48e1d930a721" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.284298 4870 scope.go:117] "RemoveContainer" containerID="e3a6e35394d201b4791302c67100d562af8287400f9c84b312e22704a65348d6" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.284431 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258\": container with ID starting with 1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258 not found: ID does not exist" containerID="1cfc68484a864e957ca34c89143ab2d216ee6d034da9aa1502a74f8358dc2258" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.288742 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovn-acl-logging/0.log" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.289359 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovn-controller/0.log" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.289860 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335050 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-var-lib-openvswitch\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335142 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335170 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-netns\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335200 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-ovn\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335207 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335260 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36037609-52f9-4c09-8beb-6d35a039347b-ovn-node-metrics-cert\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335296 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-bin\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335292 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335318 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5ps\" (UniqueName: \"kubernetes.io/projected/36037609-52f9-4c09-8beb-6d35a039347b-kube-api-access-pk5ps\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335342 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-systemd-units\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335345 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335362 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-systemd\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335387 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-script-lib\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335387 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335404 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-config\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335427 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-node-log\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335425 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335464 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-netd\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335481 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-slash\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335506 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-env-overrides\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335534 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-log-socket\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335569 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-etc-openvswitch\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335592 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-openvswitch\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335612 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-ovn-kubernetes\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.335654 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-kubelet\") pod \"36037609-52f9-4c09-8beb-6d35a039347b\" (UID: \"36037609-52f9-4c09-8beb-6d35a039347b\") " Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336115 4870 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336133 4870 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336145 4870 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336178 4870 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-netns\") on 
node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336195 4870 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336319 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336922 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.336984 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.337112 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-log-socket" (OuterVolumeSpecName: "log-socket") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.337207 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-slash" (OuterVolumeSpecName: "host-slash") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.337847 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.338059 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.338103 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.338722 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.338778 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-node-log" (OuterVolumeSpecName: "node-log") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.338818 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.339625 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.343603 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36037609-52f9-4c09-8beb-6d35a039347b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.344404 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36037609-52f9-4c09-8beb-6d35a039347b-kube-api-access-pk5ps" (OuterVolumeSpecName: "kube-api-access-pk5ps") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "kube-api-access-pk5ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.362441 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "36037609-52f9-4c09-8beb-6d35a039347b" (UID: "36037609-52f9-4c09-8beb-6d35a039347b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.365542 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nc7ds"] Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.365998 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366029 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366049 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366065 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366210 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-node" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366228 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-node" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366242 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-acl-logging" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366254 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-acl-logging" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366271 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" 
containerName="northd" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366288 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="northd" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366313 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="nbdb" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366329 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="nbdb" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366420 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="sbdb" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366442 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="sbdb" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366464 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366477 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366507 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366520 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366546 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kubecfg-setup" Jan 30 08:19:28 
crc kubenswrapper[4870]: I0130 08:19:28.366560 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kubecfg-setup" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.366580 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366593 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366779 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366807 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366825 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="northd" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366841 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-acl-logging" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366854 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="kube-rbac-proxy-node" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366869 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366911 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" 
containerName="nbdb" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366925 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366946 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366963 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="sbdb" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.366979 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovn-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.367147 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.367161 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: E0130 08:19:28.367188 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.367202 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.367394 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="36037609-52f9-4c09-8beb-6d35a039347b" containerName="ovnkube-controller" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.379796 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.437800 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-systemd\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.437906 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-node-log\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.437953 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-kubelet\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.437996 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438035 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-etc-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: 
\"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438073 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-env-overrides\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438108 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/44b4d931-dba3-441a-aa46-ab54a5a6603d-kube-api-access-659lx\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438146 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-cni-netd\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438181 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-slash\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438216 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-ovn\") pod \"ovnkube-node-nc7ds\" 
(UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438261 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-log-socket\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438319 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-var-lib-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438358 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovn-node-metrics-cert\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438398 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-cni-bin\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438432 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovnkube-script-lib\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438627 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovnkube-config\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438733 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-run-netns\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438766 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-systemd-units\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438797 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-run-ovn-kubernetes\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438825 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.438992 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36037609-52f9-4c09-8beb-6d35a039347b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439020 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk5ps\" (UniqueName: \"kubernetes.io/projected/36037609-52f9-4c09-8beb-6d35a039347b-kube-api-access-pk5ps\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439031 4870 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439044 4870 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439054 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439063 4870 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439075 4870 
reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-node-log\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439085 4870 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439093 4870 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-slash\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439104 4870 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36037609-52f9-4c09-8beb-6d35a039347b-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439114 4870 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-log-socket\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439127 4870 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439138 4870 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439148 4870 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.439160 4870 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36037609-52f9-4c09-8beb-6d35a039347b-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.540900 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-run-ovn-kubernetes\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.540971 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541038 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-systemd\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541071 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-node-log\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc 
kubenswrapper[4870]: I0130 08:19:28.541106 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-kubelet\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541127 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-run-ovn-kubernetes\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541203 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541196 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541264 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-kubelet\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541139 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541406 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-etc-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541234 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-node-log\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541456 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-env-overrides\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541493 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/44b4d931-dba3-441a-aa46-ab54a5a6603d-kube-api-access-659lx\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541534 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-cni-netd\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541569 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-slash\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541617 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-ovn\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541517 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-etc-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541670 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-cni-netd\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541676 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-log-socket\") pod 
\"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541737 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-slash\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541753 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-ovn\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541720 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-log-socket\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541835 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-run-systemd\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.541946 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-var-lib-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 
08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542011 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovn-node-metrics-cert\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542025 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-env-overrides\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542070 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-cni-bin\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542086 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-var-lib-openvswitch\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542122 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovnkube-script-lib\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542155 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-cni-bin\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542259 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovnkube-config\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542328 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-run-netns\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542368 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-systemd-units\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542390 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-host-run-netns\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542495 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/44b4d931-dba3-441a-aa46-ab54a5a6603d-systemd-units\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.542823 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovnkube-config\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.543390 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovnkube-script-lib\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.547307 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44b4d931-dba3-441a-aa46-ab54a5a6603d-ovn-node-metrics-cert\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.560003 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/44b4d931-dba3-441a-aa46-ab54a5a6603d-kube-api-access-659lx\") pod \"ovnkube-node-nc7ds\" (UID: \"44b4d931-dba3-441a-aa46-ab54a5a6603d\") " pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: I0130 08:19:28.716109 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:28 crc kubenswrapper[4870]: W0130 08:19:28.738553 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b4d931_dba3_441a_aa46_ab54a5a6603d.slice/crio-c764fac3319c45e8fab3bd66ae124d5162255e7389a855b96561e7b8c502f5a1 WatchSource:0}: Error finding container c764fac3319c45e8fab3bd66ae124d5162255e7389a855b96561e7b8c502f5a1: Status 404 returned error can't find the container with id c764fac3319c45e8fab3bd66ae124d5162255e7389a855b96561e7b8c502f5a1 Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.221346 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ltz5g" event={"ID":"dfee5a53-cd5a-470f-9327-e614ff6e56b3","Type":"ContainerStarted","Data":"28f19356301fe81e1c9f648ceed5619c54a4f33fddb3408b906af0ce16fe258b"} Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.223676 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/2.log" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.225994 4870 generic.go:334] "Generic (PLEG): container finished" podID="44b4d931-dba3-441a-aa46-ab54a5a6603d" containerID="f15a4ec572846e0d230cb2f311da4392186ca82166c81544edb20931b290fd3c" exitCode=0 Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.226060 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerDied","Data":"f15a4ec572846e0d230cb2f311da4392186ca82166c81544edb20931b290fd3c"} Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.226098 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" 
event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"c764fac3319c45e8fab3bd66ae124d5162255e7389a855b96561e7b8c502f5a1"} Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.228258 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" event={"ID":"c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1","Type":"ContainerStarted","Data":"c1ef29374d5e8e38bdd15ff8b4858697608417fcb9b52c53252656fdfeee7e74"} Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.228349 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.237507 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovn-acl-logging/0.log" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.238734 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cj5db_36037609-52f9-4c09-8beb-6d35a039347b/ovn-controller/0.log" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.239759 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" event={"ID":"36037609-52f9-4c09-8beb-6d35a039347b","Type":"ContainerDied","Data":"cb12165531731a176212e5ceb871fbc54aec2538e3ad27d93d5c0438cf177aa7"} Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.239888 4870 scope.go:117] "RemoveContainer" containerID="b741960d899fead07c73e8ea4b750a10bd019b223fe9d09e7a67a573f3e4bee3" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.240598 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cj5db" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.251309 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-ltz5g" podStartSLOduration=2.703967012 podStartE2EDuration="12.251274521s" podCreationTimestamp="2026-01-30 08:19:17 +0000 UTC" firstStartedPulling="2026-01-30 08:19:18.701471766 +0000 UTC m=+597.397018875" lastFinishedPulling="2026-01-30 08:19:28.248779235 +0000 UTC m=+606.944326384" observedRunningTime="2026-01-30 08:19:29.250444064 +0000 UTC m=+607.945991213" watchObservedRunningTime="2026-01-30 08:19:29.251274521 +0000 UTC m=+607.946821680" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.282416 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" podStartSLOduration=1.760761677 podStartE2EDuration="11.282392789s" podCreationTimestamp="2026-01-30 08:19:18 +0000 UTC" firstStartedPulling="2026-01-30 08:19:18.736012902 +0000 UTC m=+597.431560011" lastFinishedPulling="2026-01-30 08:19:28.257643974 +0000 UTC m=+606.953191123" observedRunningTime="2026-01-30 08:19:29.282081849 +0000 UTC m=+607.977628978" watchObservedRunningTime="2026-01-30 08:19:29.282392789 +0000 UTC m=+607.977939898" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.291236 4870 scope.go:117] "RemoveContainer" containerID="8b69c6f4703f6562a40a39ae67345d13bdd12f5ad857b2d82c688045680ec959" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.345123 4870 scope.go:117] "RemoveContainer" containerID="0d4f57f11509c53c37fac0cd78b5e66a0fe1f0cf9d9e871d47a8e2d1d8b5c165" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.370577 4870 scope.go:117] "RemoveContainer" containerID="a1167f51b3289cd09b28e12e21b8d070ec42207704552e03669e4165dffcb03c" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.391057 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-cj5db"] Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.396078 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cj5db"] Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.398095 4870 scope.go:117] "RemoveContainer" containerID="1df1ddb13d52250ced37d12ff2dc768bf24b77be6e6683427491bb47fa499499" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.417640 4870 scope.go:117] "RemoveContainer" containerID="c504ee48b14b937f75d9b9d9947b033d5a20e8d4d53e0f7fd5eaa22ed33cb1ad" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.446415 4870 scope.go:117] "RemoveContainer" containerID="0f6e737c741e80dcd81c80f594e37faa40894808a16895b315bd4862668e7b71" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.469329 4870 scope.go:117] "RemoveContainer" containerID="575202e2c2be0f5e4a7449bfc2f2ef700313987a24a19bc751655bb2c2d118e7" Jan 30 08:19:29 crc kubenswrapper[4870]: I0130 08:19:29.505590 4870 scope.go:117] "RemoveContainer" containerID="daefd50679d10beed7fe7a9014c1cb34525b08c2c5b9cf04311684497572745f" Jan 30 08:19:30 crc kubenswrapper[4870]: I0130 08:19:30.084735 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36037609-52f9-4c09-8beb-6d35a039347b" path="/var/lib/kubelet/pods/36037609-52f9-4c09-8beb-6d35a039347b/volumes" Jan 30 08:19:30 crc kubenswrapper[4870]: I0130 08:19:30.250460 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"d68bc46fcb018da0aa44c01d649785d6ad2936d440430360e0bfb028bbbae3b0"} Jan 30 08:19:30 crc kubenswrapper[4870]: I0130 08:19:30.250516 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" 
event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"c2cd7e88e5ea05c8fb49e55098077a9c4e91f0fa015d1a824f4c3f93cd1bedc1"} Jan 30 08:19:30 crc kubenswrapper[4870]: I0130 08:19:30.250539 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"89b9f7514af43afbb2865d7ed723d8d3ff70dd47c3039800c9fd873f0870452e"} Jan 30 08:19:30 crc kubenswrapper[4870]: I0130 08:19:30.250553 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"ad686efbb85ea3f7d8a9527f5f9cf65b6c804dacaea14aac6fe94243e7da8c6e"} Jan 30 08:19:30 crc kubenswrapper[4870]: I0130 08:19:30.250568 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"b4477771c9b4eb0f7e0362a33eb22c363d86df4400d3316a77e98a452ae9c217"} Jan 30 08:19:30 crc kubenswrapper[4870]: I0130 08:19:30.250581 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"d0d810de670c87161bc5f7c00601ff9ac02f8b3cd7857ca221772943c4c4b1bb"} Jan 30 08:19:33 crc kubenswrapper[4870]: I0130 08:19:33.279733 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"14fe3209a9ecb2c269531a1fb740f662f5c47a802f08342dabcfea6702039171"} Jan 30 08:19:33 crc kubenswrapper[4870]: I0130 08:19:33.362724 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-n5xzk" Jan 30 08:19:35 crc kubenswrapper[4870]: I0130 
08:19:35.299209 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" event={"ID":"44b4d931-dba3-441a-aa46-ab54a5a6603d","Type":"ContainerStarted","Data":"066b230d174830f8f95f81ff2297f806308b4d299c2724a83e328a4721879fa2"} Jan 30 08:19:35 crc kubenswrapper[4870]: I0130 08:19:35.299726 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:35 crc kubenswrapper[4870]: I0130 08:19:35.299775 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:35 crc kubenswrapper[4870]: I0130 08:19:35.299789 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:35 crc kubenswrapper[4870]: I0130 08:19:35.330202 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" podStartSLOduration=7.330182947 podStartE2EDuration="7.330182947s" podCreationTimestamp="2026-01-30 08:19:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:19:35.325488739 +0000 UTC m=+614.021035868" watchObservedRunningTime="2026-01-30 08:19:35.330182947 +0000 UTC m=+614.025730056" Jan 30 08:19:35 crc kubenswrapper[4870]: I0130 08:19:35.333138 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:35 crc kubenswrapper[4870]: I0130 08:19:35.333647 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:19:39 crc kubenswrapper[4870]: I0130 08:19:39.075144 4870 scope.go:117] "RemoveContainer" containerID="61538cdbec39ead4232db7d69f7b41605b3dfdb222395b4c93251e6ded8b3e41" Jan 30 08:19:39 crc kubenswrapper[4870]: E0130 
08:19:39.076301 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hsmrb_openshift-multus(3e8e9e25-2b9b-4820-8282-48e1d930a721)\"" pod="openshift-multus/multus-hsmrb" podUID="3e8e9e25-2b9b-4820-8282-48e1d930a721" Jan 30 08:19:50 crc kubenswrapper[4870]: I0130 08:19:50.074685 4870 scope.go:117] "RemoveContainer" containerID="61538cdbec39ead4232db7d69f7b41605b3dfdb222395b4c93251e6ded8b3e41" Jan 30 08:19:50 crc kubenswrapper[4870]: I0130 08:19:50.417138 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hsmrb_3e8e9e25-2b9b-4820-8282-48e1d930a721/kube-multus/2.log" Jan 30 08:19:50 crc kubenswrapper[4870]: I0130 08:19:50.417537 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hsmrb" event={"ID":"3e8e9e25-2b9b-4820-8282-48e1d930a721","Type":"ContainerStarted","Data":"a5c2960bb0e1aa6565d35611f4225e3dc7b6fdd6bce853738ed6f884200ad264"} Jan 30 08:19:58 crc kubenswrapper[4870]: I0130 08:19:58.747941 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nc7ds" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.582144 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m"] Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.584197 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.587340 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.604081 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m"] Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.755396 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.755481 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htwp\" (UniqueName: \"kubernetes.io/projected/69895f16-2797-4fd7-aedf-54fc47cd2c4f-kube-api-access-5htwp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.755559 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: 
I0130 08:20:08.856395 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.856469 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5htwp\" (UniqueName: \"kubernetes.io/projected/69895f16-2797-4fd7-aedf-54fc47cd2c4f-kube-api-access-5htwp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.856523 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.857053 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.857233 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.889712 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5htwp\" (UniqueName: \"kubernetes.io/projected/69895f16-2797-4fd7-aedf-54fc47cd2c4f-kube-api-access-5htwp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:08 crc kubenswrapper[4870]: I0130 08:20:08.902741 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:09 crc kubenswrapper[4870]: I0130 08:20:09.207497 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m"] Jan 30 08:20:09 crc kubenswrapper[4870]: I0130 08:20:09.541630 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" event={"ID":"69895f16-2797-4fd7-aedf-54fc47cd2c4f","Type":"ContainerStarted","Data":"4a3e158ce6ac7c5a2feaf02f10ff033257f4c33e39f21539dcbaff9607aa0dfd"} Jan 30 08:20:09 crc kubenswrapper[4870]: I0130 08:20:09.541706 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" event={"ID":"69895f16-2797-4fd7-aedf-54fc47cd2c4f","Type":"ContainerStarted","Data":"75e1968f6545046cae56f05b6e9eb3d512ef14cce74ed0927ac9f89dd3e78524"} Jan 30 08:20:10 crc kubenswrapper[4870]: I0130 08:20:10.553252 4870 
generic.go:334] "Generic (PLEG): container finished" podID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerID="4a3e158ce6ac7c5a2feaf02f10ff033257f4c33e39f21539dcbaff9607aa0dfd" exitCode=0 Jan 30 08:20:10 crc kubenswrapper[4870]: I0130 08:20:10.553301 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" event={"ID":"69895f16-2797-4fd7-aedf-54fc47cd2c4f","Type":"ContainerDied","Data":"4a3e158ce6ac7c5a2feaf02f10ff033257f4c33e39f21539dcbaff9607aa0dfd"} Jan 30 08:20:12 crc kubenswrapper[4870]: I0130 08:20:12.570310 4870 generic.go:334] "Generic (PLEG): container finished" podID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerID="9feacfa1b0a9c420e41b6a6c567f4b1e361ca798b3e991ab07989f5ced3f5d37" exitCode=0 Jan 30 08:20:12 crc kubenswrapper[4870]: I0130 08:20:12.570392 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" event={"ID":"69895f16-2797-4fd7-aedf-54fc47cd2c4f","Type":"ContainerDied","Data":"9feacfa1b0a9c420e41b6a6c567f4b1e361ca798b3e991ab07989f5ced3f5d37"} Jan 30 08:20:13 crc kubenswrapper[4870]: I0130 08:20:13.582724 4870 generic.go:334] "Generic (PLEG): container finished" podID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerID="11e579aceb4c6af641442fbe030cf62d99a32ed13743bd76897395330bfbe6c5" exitCode=0 Jan 30 08:20:13 crc kubenswrapper[4870]: I0130 08:20:13.582837 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" event={"ID":"69895f16-2797-4fd7-aedf-54fc47cd2c4f","Type":"ContainerDied","Data":"11e579aceb4c6af641442fbe030cf62d99a32ed13743bd76897395330bfbe6c5"} Jan 30 08:20:14 crc kubenswrapper[4870]: I0130 08:20:14.887552 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:14 crc kubenswrapper[4870]: I0130 08:20:14.942049 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-bundle\") pod \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " Jan 30 08:20:14 crc kubenswrapper[4870]: I0130 08:20:14.947007 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-bundle" (OuterVolumeSpecName: "bundle") pod "69895f16-2797-4fd7-aedf-54fc47cd2c4f" (UID: "69895f16-2797-4fd7-aedf-54fc47cd2c4f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.043182 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-util\") pod \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.043320 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htwp\" (UniqueName: \"kubernetes.io/projected/69895f16-2797-4fd7-aedf-54fc47cd2c4f-kube-api-access-5htwp\") pod \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\" (UID: \"69895f16-2797-4fd7-aedf-54fc47cd2c4f\") " Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.043778 4870 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.052329 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/69895f16-2797-4fd7-aedf-54fc47cd2c4f-kube-api-access-5htwp" (OuterVolumeSpecName: "kube-api-access-5htwp") pod "69895f16-2797-4fd7-aedf-54fc47cd2c4f" (UID: "69895f16-2797-4fd7-aedf-54fc47cd2c4f"). InnerVolumeSpecName "kube-api-access-5htwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.129811 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-util" (OuterVolumeSpecName: "util") pod "69895f16-2797-4fd7-aedf-54fc47cd2c4f" (UID: "69895f16-2797-4fd7-aedf-54fc47cd2c4f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.146007 4870 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69895f16-2797-4fd7-aedf-54fc47cd2c4f-util\") on node \"crc\" DevicePath \"\"" Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.146042 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5htwp\" (UniqueName: \"kubernetes.io/projected/69895f16-2797-4fd7-aedf-54fc47cd2c4f-kube-api-access-5htwp\") on node \"crc\" DevicePath \"\"" Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.602943 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" event={"ID":"69895f16-2797-4fd7-aedf-54fc47cd2c4f","Type":"ContainerDied","Data":"75e1968f6545046cae56f05b6e9eb3d512ef14cce74ed0927ac9f89dd3e78524"} Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.603027 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75e1968f6545046cae56f05b6e9eb3d512ef14cce74ed0927ac9f89dd3e78524" Jan 30 08:20:15 crc kubenswrapper[4870]: I0130 08:20:15.603072 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.673823 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn"] Jan 30 08:20:26 crc kubenswrapper[4870]: E0130 08:20:26.674923 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerName="util" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.674942 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerName="util" Jan 30 08:20:26 crc kubenswrapper[4870]: E0130 08:20:26.674958 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerName="pull" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.674968 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerName="pull" Jan 30 08:20:26 crc kubenswrapper[4870]: E0130 08:20:26.674985 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerName="extract" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.674994 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerName="extract" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.675148 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="69895f16-2797-4fd7-aedf-54fc47cd2c4f" containerName="extract" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.675693 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.678679 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-lnpzq" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.678857 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.678970 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.685847 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn"] Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.739857 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579ms\" (UniqueName: \"kubernetes.io/projected/614f63fc-ed66-41bb-b9fe-4229b3b67f50-kube-api-access-579ms\") pod \"obo-prometheus-operator-68bc856cb9-hj2pn\" (UID: \"614f63fc-ed66-41bb-b9fe-4229b3b67f50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.806179 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8"] Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.807056 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.809661 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.809930 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-b5jzw" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.810708 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf"] Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.811818 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.833137 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8"] Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.836464 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf"] Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.845053 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/586011b7-bc23-4a41-8795-bc28910cd170-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-9clj8\" (UID: \"586011b7-bc23-4a41-8795-bc28910cd170\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.845324 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b8b459d-7a00-4e96-8916-4edd9fc87b99-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf\" (UID: \"1b8b459d-7a00-4e96-8916-4edd9fc87b99\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.845439 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579ms\" (UniqueName: \"kubernetes.io/projected/614f63fc-ed66-41bb-b9fe-4229b3b67f50-kube-api-access-579ms\") pod \"obo-prometheus-operator-68bc856cb9-hj2pn\" (UID: \"614f63fc-ed66-41bb-b9fe-4229b3b67f50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.845509 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/586011b7-bc23-4a41-8795-bc28910cd170-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-9clj8\" (UID: \"586011b7-bc23-4a41-8795-bc28910cd170\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.845578 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b8b459d-7a00-4e96-8916-4edd9fc87b99-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf\" (UID: \"1b8b459d-7a00-4e96-8916-4edd9fc87b99\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.894039 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579ms\" (UniqueName: \"kubernetes.io/projected/614f63fc-ed66-41bb-b9fe-4229b3b67f50-kube-api-access-579ms\") pod 
\"obo-prometheus-operator-68bc856cb9-hj2pn\" (UID: \"614f63fc-ed66-41bb-b9fe-4229b3b67f50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.947610 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/586011b7-bc23-4a41-8795-bc28910cd170-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-9clj8\" (UID: \"586011b7-bc23-4a41-8795-bc28910cd170\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.948162 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b8b459d-7a00-4e96-8916-4edd9fc87b99-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf\" (UID: \"1b8b459d-7a00-4e96-8916-4edd9fc87b99\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.948478 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/586011b7-bc23-4a41-8795-bc28910cd170-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-9clj8\" (UID: \"586011b7-bc23-4a41-8795-bc28910cd170\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.948697 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b8b459d-7a00-4e96-8916-4edd9fc87b99-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf\" (UID: \"1b8b459d-7a00-4e96-8916-4edd9fc87b99\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:26 crc 
kubenswrapper[4870]: I0130 08:20:26.953573 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b8b459d-7a00-4e96-8916-4edd9fc87b99-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf\" (UID: \"1b8b459d-7a00-4e96-8916-4edd9fc87b99\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.953677 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b8b459d-7a00-4e96-8916-4edd9fc87b99-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf\" (UID: \"1b8b459d-7a00-4e96-8916-4edd9fc87b99\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.958489 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/586011b7-bc23-4a41-8795-bc28910cd170-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-9clj8\" (UID: \"586011b7-bc23-4a41-8795-bc28910cd170\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.959951 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/586011b7-bc23-4a41-8795-bc28910cd170-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bf5558b74-9clj8\" (UID: \"586011b7-bc23-4a41-8795-bc28910cd170\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:26 crc kubenswrapper[4870]: I0130 08:20:26.991033 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.042011 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lv4dk"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.043088 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.050896 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gnlcv" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.050961 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.051945 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gqct\" (UniqueName: \"kubernetes.io/projected/0f7d84eb-b450-4168-b207-22520fed3fd3-kube-api-access-7gqct\") pod \"observability-operator-59bdc8b94-lv4dk\" (UID: \"0f7d84eb-b450-4168-b207-22520fed3fd3\") " pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.052058 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f7d84eb-b450-4168-b207-22520fed3fd3-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lv4dk\" (UID: \"0f7d84eb-b450-4168-b207-22520fed3fd3\") " pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.056380 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lv4dk"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 
08:20:27.131421 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.131434 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.155631 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f7d84eb-b450-4168-b207-22520fed3fd3-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lv4dk\" (UID: \"0f7d84eb-b450-4168-b207-22520fed3fd3\") " pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.155756 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gqct\" (UniqueName: \"kubernetes.io/projected/0f7d84eb-b450-4168-b207-22520fed3fd3-kube-api-access-7gqct\") pod \"observability-operator-59bdc8b94-lv4dk\" (UID: \"0f7d84eb-b450-4168-b207-22520fed3fd3\") " pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.163742 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f7d84eb-b450-4168-b207-22520fed3fd3-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lv4dk\" (UID: \"0f7d84eb-b450-4168-b207-22520fed3fd3\") " pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.173828 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gqct\" (UniqueName: \"kubernetes.io/projected/0f7d84eb-b450-4168-b207-22520fed3fd3-kube-api-access-7gqct\") pod 
\"observability-operator-59bdc8b94-lv4dk\" (UID: \"0f7d84eb-b450-4168-b207-22520fed3fd3\") " pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.220628 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tmzq2"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.221318 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.228801 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vndv4" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.240536 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tmzq2"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.254980 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.259298 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/962cb597-f461-4983-b37a-a4c9e545f7d8-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tmzq2\" (UID: \"962cb597-f461-4983-b37a-a4c9e545f7d8\") " pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.259409 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjs2\" (UniqueName: \"kubernetes.io/projected/962cb597-f461-4983-b37a-a4c9e545f7d8-kube-api-access-8sjs2\") pod \"perses-operator-5bf474d74f-tmzq2\" (UID: \"962cb597-f461-4983-b37a-a4c9e545f7d8\") " pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: 
I0130 08:20:27.363135 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjs2\" (UniqueName: \"kubernetes.io/projected/962cb597-f461-4983-b37a-a4c9e545f7d8-kube-api-access-8sjs2\") pod \"perses-operator-5bf474d74f-tmzq2\" (UID: \"962cb597-f461-4983-b37a-a4c9e545f7d8\") " pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.363197 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/962cb597-f461-4983-b37a-a4c9e545f7d8-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tmzq2\" (UID: \"962cb597-f461-4983-b37a-a4c9e545f7d8\") " pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.364276 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/962cb597-f461-4983-b37a-a4c9e545f7d8-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tmzq2\" (UID: \"962cb597-f461-4983-b37a-a4c9e545f7d8\") " pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.367622 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.398775 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjs2\" (UniqueName: \"kubernetes.io/projected/962cb597-f461-4983-b37a-a4c9e545f7d8-kube-api-access-8sjs2\") pod \"perses-operator-5bf474d74f-tmzq2\" (UID: \"962cb597-f461-4983-b37a-a4c9e545f7d8\") " pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.476508 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.510034 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.569447 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.687085 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" event={"ID":"586011b7-bc23-4a41-8795-bc28910cd170","Type":"ContainerStarted","Data":"0718e3efba7ab00d61089aa48f0234f3c19944124782bf6dc48db1ceb5ea4dbe"} Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.712087 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" event={"ID":"614f63fc-ed66-41bb-b9fe-4229b3b67f50","Type":"ContainerStarted","Data":"39d61e2b9f773d01c9ac965557ad0282920802eea08954910d03acb92bfd9278"} Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.725183 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" event={"ID":"1b8b459d-7a00-4e96-8916-4edd9fc87b99","Type":"ContainerStarted","Data":"f6d600f3277865fd9bfbffd570e8b09f94ff53d2fa9a50a6306e8c32eb833e37"} Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.817272 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tmzq2"] Jan 30 08:20:27 crc kubenswrapper[4870]: I0130 08:20:27.885192 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lv4dk"] Jan 30 08:20:28 crc kubenswrapper[4870]: I0130 08:20:28.733729 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" event={"ID":"962cb597-f461-4983-b37a-a4c9e545f7d8","Type":"ContainerStarted","Data":"2dda537d4e989c9630a8cd6927262ab32ecf60c780355e155a6feb5801476d8b"} Jan 30 08:20:28 crc kubenswrapper[4870]: I0130 08:20:28.743068 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" event={"ID":"0f7d84eb-b450-4168-b207-22520fed3fd3","Type":"ContainerStarted","Data":"22f5371384e5321a1545407fc0590c38bbba07f5d5511f5d3e4e9bce45c3206d"} Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.812751 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" event={"ID":"1b8b459d-7a00-4e96-8916-4edd9fc87b99","Type":"ContainerStarted","Data":"78e25f2b510934b0bd5b3ada7089ca22e2a2652e8e7bfb1e8e81bc64e2854433"} Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.815055 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" event={"ID":"586011b7-bc23-4a41-8795-bc28910cd170","Type":"ContainerStarted","Data":"b59ce053c8082cc1e42cb72f91c12093af2d246d90831d6513de727e82306c22"} Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.819896 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" event={"ID":"0f7d84eb-b450-4168-b207-22520fed3fd3","Type":"ContainerStarted","Data":"e0ef739332fd43c194106bab2d52cad33b9271ab639540e37296597585d9c784"} Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.820869 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.829412 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" event={"ID":"614f63fc-ed66-41bb-b9fe-4229b3b67f50","Type":"ContainerStarted","Data":"0261bbf16a602c1f35f8660cff758fe3f9e9042e3b4752cc140bd7783bb703b7"} Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.831421 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" 
event={"ID":"962cb597-f461-4983-b37a-a4c9e545f7d8","Type":"ContainerStarted","Data":"c5aea41b24bbfd35dbc306c65a3e8681a75ae912988702b88a461d8133d9cbfa"} Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.831754 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.839739 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.854576 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf" podStartSLOduration=2.336359935 podStartE2EDuration="12.854555995s" podCreationTimestamp="2026-01-30 08:20:26 +0000 UTC" firstStartedPulling="2026-01-30 08:20:27.569394644 +0000 UTC m=+666.264941753" lastFinishedPulling="2026-01-30 08:20:38.087590704 +0000 UTC m=+676.783137813" observedRunningTime="2026-01-30 08:20:38.848134523 +0000 UTC m=+677.543681632" watchObservedRunningTime="2026-01-30 08:20:38.854555995 +0000 UTC m=+677.550103104" Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.916324 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-hj2pn" podStartSLOduration=2.117184174 podStartE2EDuration="12.916302126s" podCreationTimestamp="2026-01-30 08:20:26 +0000 UTC" firstStartedPulling="2026-01-30 08:20:27.288382299 +0000 UTC m=+665.983929408" lastFinishedPulling="2026-01-30 08:20:38.087500261 +0000 UTC m=+676.783047360" observedRunningTime="2026-01-30 08:20:38.915006855 +0000 UTC m=+677.610553994" watchObservedRunningTime="2026-01-30 08:20:38.916302126 +0000 UTC m=+677.611849235" Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.920718 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-bf5558b74-9clj8" podStartSLOduration=2.323220791 podStartE2EDuration="12.920704354s" podCreationTimestamp="2026-01-30 08:20:26 +0000 UTC" firstStartedPulling="2026-01-30 08:20:27.511118761 +0000 UTC m=+666.206665870" lastFinishedPulling="2026-01-30 08:20:38.108602314 +0000 UTC m=+676.804149433" observedRunningTime="2026-01-30 08:20:38.897103271 +0000 UTC m=+677.592650390" watchObservedRunningTime="2026-01-30 08:20:38.920704354 +0000 UTC m=+677.616251463" Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.949829 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" podStartSLOduration=1.686725998 podStartE2EDuration="11.949814059s" podCreationTimestamp="2026-01-30 08:20:27 +0000 UTC" firstStartedPulling="2026-01-30 08:20:27.845026398 +0000 UTC m=+666.540573507" lastFinishedPulling="2026-01-30 08:20:38.108114449 +0000 UTC m=+676.803661568" observedRunningTime="2026-01-30 08:20:38.947477955 +0000 UTC m=+677.643025064" watchObservedRunningTime="2026-01-30 08:20:38.949814059 +0000 UTC m=+677.645361168" Jan 30 08:20:38 crc kubenswrapper[4870]: I0130 08:20:38.970956 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-lv4dk" podStartSLOduration=2.7014626809999998 podStartE2EDuration="12.970932703s" podCreationTimestamp="2026-01-30 08:20:26 +0000 UTC" firstStartedPulling="2026-01-30 08:20:27.897689843 +0000 UTC m=+666.593236952" lastFinishedPulling="2026-01-30 08:20:38.167159865 +0000 UTC m=+676.862706974" observedRunningTime="2026-01-30 08:20:38.967113672 +0000 UTC m=+677.662660791" watchObservedRunningTime="2026-01-30 08:20:38.970932703 +0000 UTC m=+677.666479812" Jan 30 08:20:47 crc kubenswrapper[4870]: I0130 08:20:47.572306 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operators/perses-operator-5bf474d74f-tmzq2" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.469136 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m"] Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.470688 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.472126 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.482612 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m"] Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.610971 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.611022 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476gr\" (UniqueName: \"kubernetes.io/projected/e702b53f-5799-4595-b78f-35717f81379f-kube-api-access-476gr\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.611069 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.712920 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.713339 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.713461 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.713639 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-476gr\" (UniqueName: 
\"kubernetes.io/projected/e702b53f-5799-4595-b78f-35717f81379f-kube-api-access-476gr\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.714057 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.744047 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-476gr\" (UniqueName: \"kubernetes.io/projected/e702b53f-5799-4595-b78f-35717f81379f-kube-api-access-476gr\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:06 crc kubenswrapper[4870]: I0130 08:21:06.787475 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:07 crc kubenswrapper[4870]: I0130 08:21:07.205539 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m"] Jan 30 08:21:08 crc kubenswrapper[4870]: I0130 08:21:08.072118 4870 generic.go:334] "Generic (PLEG): container finished" podID="e702b53f-5799-4595-b78f-35717f81379f" containerID="c46a8edcce2da87cc92f64727d7837f9580526bbaf0396b8eec4ad0aa5f7fb93" exitCode=0 Jan 30 08:21:08 crc kubenswrapper[4870]: I0130 08:21:08.072202 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" event={"ID":"e702b53f-5799-4595-b78f-35717f81379f","Type":"ContainerDied","Data":"c46a8edcce2da87cc92f64727d7837f9580526bbaf0396b8eec4ad0aa5f7fb93"} Jan 30 08:21:08 crc kubenswrapper[4870]: I0130 08:21:08.072532 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" event={"ID":"e702b53f-5799-4595-b78f-35717f81379f","Type":"ContainerStarted","Data":"d70c233e62faf05cebf8fa034b0cebb431e07a2724119a9ba26f415c197ffaf7"} Jan 30 08:21:10 crc kubenswrapper[4870]: I0130 08:21:10.086357 4870 generic.go:334] "Generic (PLEG): container finished" podID="e702b53f-5799-4595-b78f-35717f81379f" containerID="fccb5b271e00dc4a065ddb224096d7eeb02d6ecf4f0e199b42da8ec5a3715ff8" exitCode=0 Jan 30 08:21:10 crc kubenswrapper[4870]: I0130 08:21:10.087622 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" event={"ID":"e702b53f-5799-4595-b78f-35717f81379f","Type":"ContainerDied","Data":"fccb5b271e00dc4a065ddb224096d7eeb02d6ecf4f0e199b42da8ec5a3715ff8"} Jan 30 08:21:11 crc kubenswrapper[4870]: I0130 08:21:11.095463 4870 
generic.go:334] "Generic (PLEG): container finished" podID="e702b53f-5799-4595-b78f-35717f81379f" containerID="e4448b4f1e874ff8ed7d10c11ffe633ce3a9be9f2572f7e289aa86cd17295676" exitCode=0 Jan 30 08:21:11 crc kubenswrapper[4870]: I0130 08:21:11.095518 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" event={"ID":"e702b53f-5799-4595-b78f-35717f81379f","Type":"ContainerDied","Data":"e4448b4f1e874ff8ed7d10c11ffe633ce3a9be9f2572f7e289aa86cd17295676"} Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.437651 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.626980 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-bundle\") pod \"e702b53f-5799-4595-b78f-35717f81379f\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.627043 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-476gr\" (UniqueName: \"kubernetes.io/projected/e702b53f-5799-4595-b78f-35717f81379f-kube-api-access-476gr\") pod \"e702b53f-5799-4595-b78f-35717f81379f\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.627091 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-util\") pod \"e702b53f-5799-4595-b78f-35717f81379f\" (UID: \"e702b53f-5799-4595-b78f-35717f81379f\") " Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.628297 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-bundle" (OuterVolumeSpecName: "bundle") pod "e702b53f-5799-4595-b78f-35717f81379f" (UID: "e702b53f-5799-4595-b78f-35717f81379f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.638160 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e702b53f-5799-4595-b78f-35717f81379f-kube-api-access-476gr" (OuterVolumeSpecName: "kube-api-access-476gr") pod "e702b53f-5799-4595-b78f-35717f81379f" (UID: "e702b53f-5799-4595-b78f-35717f81379f"). InnerVolumeSpecName "kube-api-access-476gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.728540 4870 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.728587 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-476gr\" (UniqueName: \"kubernetes.io/projected/e702b53f-5799-4595-b78f-35717f81379f-kube-api-access-476gr\") on node \"crc\" DevicePath \"\"" Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.921483 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-util" (OuterVolumeSpecName: "util") pod "e702b53f-5799-4595-b78f-35717f81379f" (UID: "e702b53f-5799-4595-b78f-35717f81379f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:21:12 crc kubenswrapper[4870]: I0130 08:21:12.930917 4870 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e702b53f-5799-4595-b78f-35717f81379f-util\") on node \"crc\" DevicePath \"\"" Jan 30 08:21:13 crc kubenswrapper[4870]: I0130 08:21:13.127647 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" event={"ID":"e702b53f-5799-4595-b78f-35717f81379f","Type":"ContainerDied","Data":"d70c233e62faf05cebf8fa034b0cebb431e07a2724119a9ba26f415c197ffaf7"} Jan 30 08:21:13 crc kubenswrapper[4870]: I0130 08:21:13.127714 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d70c233e62faf05cebf8fa034b0cebb431e07a2724119a9ba26f415c197ffaf7" Jan 30 08:21:13 crc kubenswrapper[4870]: I0130 08:21:13.127773 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.915763 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-sf8qk"] Jan 30 08:21:17 crc kubenswrapper[4870]: E0130 08:21:17.916603 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e702b53f-5799-4595-b78f-35717f81379f" containerName="pull" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.916620 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e702b53f-5799-4595-b78f-35717f81379f" containerName="pull" Jan 30 08:21:17 crc kubenswrapper[4870]: E0130 08:21:17.916630 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e702b53f-5799-4595-b78f-35717f81379f" containerName="util" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.916639 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e702b53f-5799-4595-b78f-35717f81379f" containerName="util" Jan 30 08:21:17 crc kubenswrapper[4870]: E0130 08:21:17.916652 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e702b53f-5799-4595-b78f-35717f81379f" containerName="extract" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.916661 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e702b53f-5799-4595-b78f-35717f81379f" containerName="extract" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.916797 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e702b53f-5799-4595-b78f-35717f81379f" containerName="extract" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.917341 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.919498 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.919578 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.920153 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jpb8x" Jan 30 08:21:17 crc kubenswrapper[4870]: I0130 08:21:17.929312 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-sf8qk"] Jan 30 08:21:18 crc kubenswrapper[4870]: I0130 08:21:18.113849 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s7dq\" (UniqueName: \"kubernetes.io/projected/bdb3e88d-691c-478c-ab03-cc84b8e04ea6-kube-api-access-7s7dq\") pod \"nmstate-operator-646758c888-sf8qk\" (UID: \"bdb3e88d-691c-478c-ab03-cc84b8e04ea6\") " pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" Jan 
30 08:21:18 crc kubenswrapper[4870]: I0130 08:21:18.215195 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s7dq\" (UniqueName: \"kubernetes.io/projected/bdb3e88d-691c-478c-ab03-cc84b8e04ea6-kube-api-access-7s7dq\") pod \"nmstate-operator-646758c888-sf8qk\" (UID: \"bdb3e88d-691c-478c-ab03-cc84b8e04ea6\") " pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" Jan 30 08:21:18 crc kubenswrapper[4870]: I0130 08:21:18.236264 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s7dq\" (UniqueName: \"kubernetes.io/projected/bdb3e88d-691c-478c-ab03-cc84b8e04ea6-kube-api-access-7s7dq\") pod \"nmstate-operator-646758c888-sf8qk\" (UID: \"bdb3e88d-691c-478c-ab03-cc84b8e04ea6\") " pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" Jan 30 08:21:18 crc kubenswrapper[4870]: I0130 08:21:18.271589 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" Jan 30 08:21:18 crc kubenswrapper[4870]: I0130 08:21:18.574470 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-sf8qk"] Jan 30 08:21:18 crc kubenswrapper[4870]: W0130 08:21:18.584487 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb3e88d_691c_478c_ab03_cc84b8e04ea6.slice/crio-1d8a664e6e2e4fee6cdb103f1e8125d571ea42ae6072e5ac837eb40e12ac0ae2 WatchSource:0}: Error finding container 1d8a664e6e2e4fee6cdb103f1e8125d571ea42ae6072e5ac837eb40e12ac0ae2: Status 404 returned error can't find the container with id 1d8a664e6e2e4fee6cdb103f1e8125d571ea42ae6072e5ac837eb40e12ac0ae2 Jan 30 08:21:19 crc kubenswrapper[4870]: I0130 08:21:19.174581 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" 
event={"ID":"bdb3e88d-691c-478c-ab03-cc84b8e04ea6","Type":"ContainerStarted","Data":"1d8a664e6e2e4fee6cdb103f1e8125d571ea42ae6072e5ac837eb40e12ac0ae2"} Jan 30 08:21:21 crc kubenswrapper[4870]: I0130 08:21:21.189549 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" event={"ID":"bdb3e88d-691c-478c-ab03-cc84b8e04ea6","Type":"ContainerStarted","Data":"3251b84fc254256ff77143b377665fdb285a495e2a4dacb8ee280a918b2a834b"} Jan 30 08:21:21 crc kubenswrapper[4870]: I0130 08:21:21.213294 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-sf8qk" podStartSLOduration=2.048936836 podStartE2EDuration="4.213259418s" podCreationTimestamp="2026-01-30 08:21:17 +0000 UTC" firstStartedPulling="2026-01-30 08:21:18.587736579 +0000 UTC m=+717.283283708" lastFinishedPulling="2026-01-30 08:21:20.752059141 +0000 UTC m=+719.447606290" observedRunningTime="2026-01-30 08:21:21.209441919 +0000 UTC m=+719.904989058" watchObservedRunningTime="2026-01-30 08:21:21.213259418 +0000 UTC m=+719.908806567" Jan 30 08:21:25 crc kubenswrapper[4870]: I0130 08:21:25.249778 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:21:25 crc kubenswrapper[4870]: I0130 08:21:25.250560 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.752823 4870 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-metrics-54757c584b-xdc74"] Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.753776 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.759133 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hvv85" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.767317 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-xdc74"] Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.792465 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45"] Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.793434 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.796214 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.796766 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45"] Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.808614 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-tnl9h"] Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.809378 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.856694 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfb2\" (UniqueName: \"kubernetes.io/projected/86d16b9b-390e-442a-a74f-a9e32e92da59-kube-api-access-4zfb2\") pod \"nmstate-metrics-54757c584b-xdc74\" (UID: \"86d16b9b-390e-442a-a74f-a9e32e92da59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.917061 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9"] Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.917716 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.922592 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.922665 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.935256 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9"] Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.938844 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9j4m5" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.957568 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfb2\" (UniqueName: \"kubernetes.io/projected/86d16b9b-390e-442a-a74f-a9e32e92da59-kube-api-access-4zfb2\") pod \"nmstate-metrics-54757c584b-xdc74\" (UID: \"86d16b9b-390e-442a-a74f-a9e32e92da59\") " 
pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.957613 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-ovs-socket\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.957681 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfxfj\" (UniqueName: \"kubernetes.io/projected/f38692e7-8fd1-48e1-ab3b-07cbac975021-kube-api-access-dfxfj\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.957705 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-dbus-socket\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.957726 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-nmstate-lock\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.957750 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p524q\" (UniqueName: \"kubernetes.io/projected/06799197-023a-4ed3-a378-9a1fbf25fda2-kube-api-access-p524q\") pod \"nmstate-webhook-8474b5b9d8-rsk45\" (UID: 
\"06799197-023a-4ed3-a378-9a1fbf25fda2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.957768 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/06799197-023a-4ed3-a378-9a1fbf25fda2-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-rsk45\" (UID: \"06799197-023a-4ed3-a378-9a1fbf25fda2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:27 crc kubenswrapper[4870]: I0130 08:21:27.987702 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfb2\" (UniqueName: \"kubernetes.io/projected/86d16b9b-390e-442a-a74f-a9e32e92da59-kube-api-access-4zfb2\") pod \"nmstate-metrics-54757c584b-xdc74\" (UID: \"86d16b9b-390e-442a-a74f-a9e32e92da59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059294 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-nmstate-lock\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059345 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p524q\" (UniqueName: \"kubernetes.io/projected/06799197-023a-4ed3-a378-9a1fbf25fda2-kube-api-access-p524q\") pod \"nmstate-webhook-8474b5b9d8-rsk45\" (UID: \"06799197-023a-4ed3-a378-9a1fbf25fda2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059366 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/06799197-023a-4ed3-a378-9a1fbf25fda2-tls-key-pair\") pod 
\"nmstate-webhook-8474b5b9d8-rsk45\" (UID: \"06799197-023a-4ed3-a378-9a1fbf25fda2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059411 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-ovs-socket\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059450 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwh7d\" (UniqueName: \"kubernetes.io/projected/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-kube-api-access-zwh7d\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059476 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059510 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfxfj\" (UniqueName: \"kubernetes.io/projected/f38692e7-8fd1-48e1-ab3b-07cbac975021-kube-api-access-dfxfj\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: E0130 08:21:28.059518 4870 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not 
found Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059530 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059563 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-ovs-socket\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: E0130 08:21:28.059651 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06799197-023a-4ed3-a378-9a1fbf25fda2-tls-key-pair podName:06799197-023a-4ed3-a378-9a1fbf25fda2 nodeName:}" failed. No retries permitted until 2026-01-30 08:21:28.559632392 +0000 UTC m=+727.255179501 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/06799197-023a-4ed3-a378-9a1fbf25fda2-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-rsk45" (UID: "06799197-023a-4ed3-a378-9a1fbf25fda2") : secret "openshift-nmstate-webhook" not found Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059675 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-dbus-socket\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.059683 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-nmstate-lock\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.060042 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f38692e7-8fd1-48e1-ab3b-07cbac975021-dbus-socket\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.071318 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.082389 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfxfj\" (UniqueName: \"kubernetes.io/projected/f38692e7-8fd1-48e1-ab3b-07cbac975021-kube-api-access-dfxfj\") pod \"nmstate-handler-tnl9h\" (UID: \"f38692e7-8fd1-48e1-ab3b-07cbac975021\") " pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.100982 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p524q\" (UniqueName: \"kubernetes.io/projected/06799197-023a-4ed3-a378-9a1fbf25fda2-kube-api-access-p524q\") pod \"nmstate-webhook-8474b5b9d8-rsk45\" (UID: \"06799197-023a-4ed3-a378-9a1fbf25fda2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.143427 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.160424 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwh7d\" (UniqueName: \"kubernetes.io/projected/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-kube-api-access-zwh7d\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.160475 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.160517 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.161696 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.170268 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.175339 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5897df5b9-8hs5t"] Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.176363 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.197970 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5897df5b9-8hs5t"] Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.200925 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwh7d\" (UniqueName: \"kubernetes.io/projected/b7e9a284-8b5c-4ae7-b388-3e9f907082d2-kube-api-access-zwh7d\") pod \"nmstate-console-plugin-7754f76f8b-ql9j9\" (UID: \"b7e9a284-8b5c-4ae7-b388-3e9f907082d2\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.239447 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.265512 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-trusted-ca-bundle\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.265548 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-oauth-config\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.265570 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-service-ca\") pod \"console-5897df5b9-8hs5t\" 
(UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.265587 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-config\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.265635 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skr2v\" (UniqueName: \"kubernetes.io/projected/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-kube-api-access-skr2v\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.265678 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-serving-cert\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.265706 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-oauth-serving-cert\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.366859 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-trusted-ca-bundle\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.366906 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-oauth-config\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.366926 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-service-ca\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.366943 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-config\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.366986 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skr2v\" (UniqueName: \"kubernetes.io/projected/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-kube-api-access-skr2v\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.367021 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-serving-cert\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.367060 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-oauth-serving-cert\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.367888 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-oauth-serving-cert\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.368112 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-config\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.368492 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-service-ca\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.368682 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-trusted-ca-bundle\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.372643 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-serving-cert\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.372981 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-console-oauth-config\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.385853 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skr2v\" (UniqueName: \"kubernetes.io/projected/6eacb293-6cdf-4bfa-ad11-c81ea261a90c-kube-api-access-skr2v\") pod \"console-5897df5b9-8hs5t\" (UID: \"6eacb293-6cdf-4bfa-ad11-c81ea261a90c\") " pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.405974 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-xdc74"] Jan 30 08:21:28 crc kubenswrapper[4870]: W0130 08:21:28.453344 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d16b9b_390e_442a_a74f_a9e32e92da59.slice/crio-358e4514ffb84330fd25e69ee0dc892c8d14eda97624943fd2053139b44896ef WatchSource:0}: Error finding container 358e4514ffb84330fd25e69ee0dc892c8d14eda97624943fd2053139b44896ef: Status 404 returned error 
can't find the container with id 358e4514ffb84330fd25e69ee0dc892c8d14eda97624943fd2053139b44896ef Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.553782 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.569355 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/06799197-023a-4ed3-a378-9a1fbf25fda2-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-rsk45\" (UID: \"06799197-023a-4ed3-a378-9a1fbf25fda2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.572651 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/06799197-023a-4ed3-a378-9a1fbf25fda2-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-rsk45\" (UID: \"06799197-023a-4ed3-a378-9a1fbf25fda2\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.585514 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9"] Jan 30 08:21:28 crc kubenswrapper[4870]: W0130 08:21:28.596545 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e9a284_8b5c_4ae7_b388_3e9f907082d2.slice/crio-0ef8512f36f7725ccd0d4431188276aa474d003ed9ff559065ba6e4a430cec4e WatchSource:0}: Error finding container 0ef8512f36f7725ccd0d4431188276aa474d003ed9ff559065ba6e4a430cec4e: Status 404 returned error can't find the container with id 0ef8512f36f7725ccd0d4431188276aa474d003ed9ff559065ba6e4a430cec4e Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.722347 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.769765 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5897df5b9-8hs5t"] Jan 30 08:21:28 crc kubenswrapper[4870]: I0130 08:21:28.961792 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45"] Jan 30 08:21:28 crc kubenswrapper[4870]: W0130 08:21:28.968443 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06799197_023a_4ed3_a378_9a1fbf25fda2.slice/crio-4710f89c399c1b9235af97b8274073cc7b4d5dbf3832d3061f5bc9c987aa10ba WatchSource:0}: Error finding container 4710f89c399c1b9235af97b8274073cc7b4d5dbf3832d3061f5bc9c987aa10ba: Status 404 returned error can't find the container with id 4710f89c399c1b9235af97b8274073cc7b4d5dbf3832d3061f5bc9c987aa10ba Jan 30 08:21:29 crc kubenswrapper[4870]: I0130 08:21:29.253063 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5897df5b9-8hs5t" event={"ID":"6eacb293-6cdf-4bfa-ad11-c81ea261a90c","Type":"ContainerStarted","Data":"e055e94b96e4216d6559e74e36e9cd294ac30f34705e81bec5f402a7223bc3a9"} Jan 30 08:21:29 crc kubenswrapper[4870]: I0130 08:21:29.253392 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5897df5b9-8hs5t" event={"ID":"6eacb293-6cdf-4bfa-ad11-c81ea261a90c","Type":"ContainerStarted","Data":"10a27ce2c850faa744d614b32c949e7a8f6a20dfd82ff067bf94e290381b71f6"} Jan 30 08:21:29 crc kubenswrapper[4870]: I0130 08:21:29.255763 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tnl9h" event={"ID":"f38692e7-8fd1-48e1-ab3b-07cbac975021","Type":"ContainerStarted","Data":"714384e6f35d8667a2c0ec2049a76dedac05d249c1bf5d72b57cf6212965cfa8"} Jan 30 08:21:29 crc kubenswrapper[4870]: I0130 08:21:29.257184 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" event={"ID":"06799197-023a-4ed3-a378-9a1fbf25fda2","Type":"ContainerStarted","Data":"4710f89c399c1b9235af97b8274073cc7b4d5dbf3832d3061f5bc9c987aa10ba"} Jan 30 08:21:29 crc kubenswrapper[4870]: I0130 08:21:29.258786 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" event={"ID":"86d16b9b-390e-442a-a74f-a9e32e92da59","Type":"ContainerStarted","Data":"358e4514ffb84330fd25e69ee0dc892c8d14eda97624943fd2053139b44896ef"} Jan 30 08:21:29 crc kubenswrapper[4870]: I0130 08:21:29.260241 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" event={"ID":"b7e9a284-8b5c-4ae7-b388-3e9f907082d2","Type":"ContainerStarted","Data":"0ef8512f36f7725ccd0d4431188276aa474d003ed9ff559065ba6e4a430cec4e"} Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.113550 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5897df5b9-8hs5t" podStartSLOduration=4.113531443 podStartE2EDuration="4.113531443s" podCreationTimestamp="2026-01-30 08:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:21:29.282453007 +0000 UTC m=+727.978000136" watchObservedRunningTime="2026-01-30 08:21:32.113531443 +0000 UTC m=+730.809078562" Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.284380 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tnl9h" event={"ID":"f38692e7-8fd1-48e1-ab3b-07cbac975021","Type":"ContainerStarted","Data":"41083f37094394c08c285d7c08f4c84f3894891efff7c6dd2c28be0016027952"} Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.284733 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 
08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.286010 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" event={"ID":"06799197-023a-4ed3-a378-9a1fbf25fda2","Type":"ContainerStarted","Data":"1f845c3c8beb7dddd18c70f163a7aa11918010fc775f0a1a1d0a80c939beacfb"} Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.286160 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.287582 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" event={"ID":"86d16b9b-390e-442a-a74f-a9e32e92da59","Type":"ContainerStarted","Data":"75ecda3d83b35afa0c77ab887b7cfc05296e8eb903762fb7374a7f71db88c907"} Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.289304 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" event={"ID":"b7e9a284-8b5c-4ae7-b388-3e9f907082d2","Type":"ContainerStarted","Data":"a23a103ca806d886d2eb48f5f9debd758377d6a78ef7d2c350b6e804e3c01268"} Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.303409 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-tnl9h" podStartSLOduration=2.119997817 podStartE2EDuration="5.303378985s" podCreationTimestamp="2026-01-30 08:21:27 +0000 UTC" firstStartedPulling="2026-01-30 08:21:28.303024408 +0000 UTC m=+726.998571517" lastFinishedPulling="2026-01-30 08:21:31.486405536 +0000 UTC m=+730.181952685" observedRunningTime="2026-01-30 08:21:32.301365082 +0000 UTC m=+730.996912231" watchObservedRunningTime="2026-01-30 08:21:32.303378985 +0000 UTC m=+730.998926104" Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.320480 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-ql9j9" 
podStartSLOduration=2.455715438 podStartE2EDuration="5.320459993s" podCreationTimestamp="2026-01-30 08:21:27 +0000 UTC" firstStartedPulling="2026-01-30 08:21:28.601153096 +0000 UTC m=+727.296700205" lastFinishedPulling="2026-01-30 08:21:31.465897641 +0000 UTC m=+730.161444760" observedRunningTime="2026-01-30 08:21:32.319165341 +0000 UTC m=+731.014712440" watchObservedRunningTime="2026-01-30 08:21:32.320459993 +0000 UTC m=+731.016007102" Jan 30 08:21:32 crc kubenswrapper[4870]: I0130 08:21:32.352273 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" podStartSLOduration=2.836967319 podStartE2EDuration="5.352250782s" podCreationTimestamp="2026-01-30 08:21:27 +0000 UTC" firstStartedPulling="2026-01-30 08:21:28.971103703 +0000 UTC m=+727.666650812" lastFinishedPulling="2026-01-30 08:21:31.486387126 +0000 UTC m=+730.181934275" observedRunningTime="2026-01-30 08:21:32.346816191 +0000 UTC m=+731.042363320" watchObservedRunningTime="2026-01-30 08:21:32.352250782 +0000 UTC m=+731.047797901" Jan 30 08:21:34 crc kubenswrapper[4870]: I0130 08:21:34.307416 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" event={"ID":"86d16b9b-390e-442a-a74f-a9e32e92da59","Type":"ContainerStarted","Data":"4d4ac78af20139f6a6ff96b6b70747a75a04e971452dbdf329654029c1e3adc6"} Jan 30 08:21:34 crc kubenswrapper[4870]: I0130 08:21:34.328677 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-xdc74" podStartSLOduration=1.934031157 podStartE2EDuration="7.328651472s" podCreationTimestamp="2026-01-30 08:21:27 +0000 UTC" firstStartedPulling="2026-01-30 08:21:28.456042561 +0000 UTC m=+727.151589670" lastFinishedPulling="2026-01-30 08:21:33.850662856 +0000 UTC m=+732.546209985" observedRunningTime="2026-01-30 08:21:34.324022626 +0000 UTC m=+733.019569735" watchObservedRunningTime="2026-01-30 
08:21:34.328651472 +0000 UTC m=+733.024198581" Jan 30 08:21:38 crc kubenswrapper[4870]: I0130 08:21:38.220850 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-tnl9h" Jan 30 08:21:38 crc kubenswrapper[4870]: I0130 08:21:38.554870 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:38 crc kubenswrapper[4870]: I0130 08:21:38.554999 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:38 crc kubenswrapper[4870]: I0130 08:21:38.560755 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:39 crc kubenswrapper[4870]: I0130 08:21:39.353559 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5897df5b9-8hs5t" Jan 30 08:21:39 crc kubenswrapper[4870]: I0130 08:21:39.424669 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2mj87"] Jan 30 08:21:48 crc kubenswrapper[4870]: I0130 08:21:48.731285 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-rsk45" Jan 30 08:21:55 crc kubenswrapper[4870]: I0130 08:21:55.249270 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:21:55 crc kubenswrapper[4870]: I0130 08:21:55.250293 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:22:04 crc kubenswrapper[4870]: I0130 08:22:04.507600 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-2mj87" podUID="2aa49ce7-f902-408a-94f1-da14a661e813" containerName="console" containerID="cri-o://8f4187e8ca6a92ee4bd9e6838556b7bbedaba64d18a5aff0c37ced233ebdc3dd" gracePeriod=15 Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.571639 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2mj87_2aa49ce7-f902-408a-94f1-da14a661e813/console/0.log" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.572120 4870 generic.go:334] "Generic (PLEG): container finished" podID="2aa49ce7-f902-408a-94f1-da14a661e813" containerID="8f4187e8ca6a92ee4bd9e6838556b7bbedaba64d18a5aff0c37ced233ebdc3dd" exitCode=2 Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.572161 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2mj87" event={"ID":"2aa49ce7-f902-408a-94f1-da14a661e813","Type":"ContainerDied","Data":"8f4187e8ca6a92ee4bd9e6838556b7bbedaba64d18a5aff0c37ced233ebdc3dd"} Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.577224 4870 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.749348 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2mj87_2aa49ce7-f902-408a-94f1-da14a661e813/console/0.log" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.749431 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.777767 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-oauth-config\") pod \"2aa49ce7-f902-408a-94f1-da14a661e813\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.777852 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-console-config\") pod \"2aa49ce7-f902-408a-94f1-da14a661e813\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.777916 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-service-ca\") pod \"2aa49ce7-f902-408a-94f1-da14a661e813\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.777953 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-trusted-ca-bundle\") pod \"2aa49ce7-f902-408a-94f1-da14a661e813\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.777999 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-serving-cert\") pod \"2aa49ce7-f902-408a-94f1-da14a661e813\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.778037 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-oauth-serving-cert\") pod \"2aa49ce7-f902-408a-94f1-da14a661e813\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.778067 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6hw\" (UniqueName: \"kubernetes.io/projected/2aa49ce7-f902-408a-94f1-da14a661e813-kube-api-access-ct6hw\") pod \"2aa49ce7-f902-408a-94f1-da14a661e813\" (UID: \"2aa49ce7-f902-408a-94f1-da14a661e813\") " Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.778828 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2aa49ce7-f902-408a-94f1-da14a661e813" (UID: "2aa49ce7-f902-408a-94f1-da14a661e813"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.778942 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-service-ca" (OuterVolumeSpecName: "service-ca") pod "2aa49ce7-f902-408a-94f1-da14a661e813" (UID: "2aa49ce7-f902-408a-94f1-da14a661e813"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.779344 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2aa49ce7-f902-408a-94f1-da14a661e813" (UID: "2aa49ce7-f902-408a-94f1-da14a661e813"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.779559 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-console-config" (OuterVolumeSpecName: "console-config") pod "2aa49ce7-f902-408a-94f1-da14a661e813" (UID: "2aa49ce7-f902-408a-94f1-da14a661e813"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.786616 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2aa49ce7-f902-408a-94f1-da14a661e813" (UID: "2aa49ce7-f902-408a-94f1-da14a661e813"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.787513 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2aa49ce7-f902-408a-94f1-da14a661e813" (UID: "2aa49ce7-f902-408a-94f1-da14a661e813"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.802587 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa49ce7-f902-408a-94f1-da14a661e813-kube-api-access-ct6hw" (OuterVolumeSpecName: "kube-api-access-ct6hw") pod "2aa49ce7-f902-408a-94f1-da14a661e813" (UID: "2aa49ce7-f902-408a-94f1-da14a661e813"). InnerVolumeSpecName "kube-api-access-ct6hw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.879568 4870 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.879603 4870 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.879613 4870 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.879622 4870 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.879631 4870 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa49ce7-f902-408a-94f1-da14a661e813-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.879640 4870 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2aa49ce7-f902-408a-94f1-da14a661e813-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:05 crc kubenswrapper[4870]: I0130 08:22:05.879649 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct6hw\" (UniqueName: \"kubernetes.io/projected/2aa49ce7-f902-408a-94f1-da14a661e813-kube-api-access-ct6hw\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:06 crc 
kubenswrapper[4870]: I0130 08:22:06.276470 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp"] Jan 30 08:22:06 crc kubenswrapper[4870]: E0130 08:22:06.277283 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa49ce7-f902-408a-94f1-da14a661e813" containerName="console" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.277380 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa49ce7-f902-408a-94f1-da14a661e813" containerName="console" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.277628 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa49ce7-f902-408a-94f1-da14a661e813" containerName="console" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.278971 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.283231 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.284113 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm5k4\" (UniqueName: \"kubernetes.io/projected/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-kube-api-access-nm5k4\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.284286 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-util\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.284339 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.298472 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp"] Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.385608 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.386145 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.386411 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm5k4\" (UniqueName: 
\"kubernetes.io/projected/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-kube-api-access-nm5k4\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.386555 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.386667 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.404631 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm5k4\" (UniqueName: \"kubernetes.io/projected/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-kube-api-access-nm5k4\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.581099 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2mj87_2aa49ce7-f902-408a-94f1-da14a661e813/console/0.log" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.581175 4870 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-console/console-f9d7485db-2mj87" event={"ID":"2aa49ce7-f902-408a-94f1-da14a661e813","Type":"ContainerDied","Data":"11900425e10bfa9bf6c9c649d5dac8048b1ed7e104a45655b98935b712a80d21"} Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.581231 4870 scope.go:117] "RemoveContainer" containerID="8f4187e8ca6a92ee4bd9e6838556b7bbedaba64d18a5aff0c37ced233ebdc3dd" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.581264 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2mj87" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.593481 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.619142 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2mj87"] Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.631424 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-2mj87"] Jan 30 08:22:06 crc kubenswrapper[4870]: I0130 08:22:06.900848 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp"] Jan 30 08:22:07 crc kubenswrapper[4870]: I0130 08:22:07.590823 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerID="23ad735f0ffeebd3480c7cdebe5a8540768ddd6875662b4db45bf411655e8342" exitCode=0 Jan 30 08:22:07 crc kubenswrapper[4870]: I0130 08:22:07.590950 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" event={"ID":"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef","Type":"ContainerDied","Data":"23ad735f0ffeebd3480c7cdebe5a8540768ddd6875662b4db45bf411655e8342"} Jan 30 08:22:07 crc 
kubenswrapper[4870]: I0130 08:22:07.591213 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" event={"ID":"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef","Type":"ContainerStarted","Data":"34734030669c64ebec5619500a21ee834583e2523ce16b72917ee829c9a330c2"} Jan 30 08:22:08 crc kubenswrapper[4870]: I0130 08:22:08.083629 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa49ce7-f902-408a-94f1-da14a661e813" path="/var/lib/kubelet/pods/2aa49ce7-f902-408a-94f1-da14a661e813/volumes" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.613384 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerID="ea03883aaea5dae7986706ea4e1c998aca37b9c10769ae0311753a2237efe194" exitCode=0 Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.613928 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" event={"ID":"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef","Type":"ContainerDied","Data":"ea03883aaea5dae7986706ea4e1c998aca37b9c10769ae0311753a2237efe194"} Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.614821 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gt6dl"] Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.616832 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.627713 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gt6dl"] Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.789795 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-utilities\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.789901 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2hf\" (UniqueName: \"kubernetes.io/projected/d56954f4-c2bc-42a1-bfa9-51433acd8c15-kube-api-access-9x2hf\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.790020 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-catalog-content\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.891033 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-utilities\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.891404 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9x2hf\" (UniqueName: \"kubernetes.io/projected/d56954f4-c2bc-42a1-bfa9-51433acd8c15-kube-api-access-9x2hf\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.891453 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-catalog-content\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.891625 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-utilities\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.892135 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-catalog-content\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:09 crc kubenswrapper[4870]: I0130 08:22:09.925125 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x2hf\" (UniqueName: \"kubernetes.io/projected/d56954f4-c2bc-42a1-bfa9-51433acd8c15-kube-api-access-9x2hf\") pod \"redhat-operators-gt6dl\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") " pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:10 crc kubenswrapper[4870]: I0130 08:22:10.036648 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:10 crc kubenswrapper[4870]: I0130 08:22:10.305373 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gt6dl"] Jan 30 08:22:10 crc kubenswrapper[4870]: W0130 08:22:10.314496 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56954f4_c2bc_42a1_bfa9_51433acd8c15.slice/crio-16fd0a4741adbeb0be8f4ebb1f22a43c1cc7a74b395ee8549eea57d0f522edff WatchSource:0}: Error finding container 16fd0a4741adbeb0be8f4ebb1f22a43c1cc7a74b395ee8549eea57d0f522edff: Status 404 returned error can't find the container with id 16fd0a4741adbeb0be8f4ebb1f22a43c1cc7a74b395ee8549eea57d0f522edff Jan 30 08:22:10 crc kubenswrapper[4870]: I0130 08:22:10.622109 4870 generic.go:334] "Generic (PLEG): container finished" podID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerID="ff6385d320f64d2aa5da3b6fecd0209ec974596a453947b1b96d5f37c7910d99" exitCode=0 Jan 30 08:22:10 crc kubenswrapper[4870]: I0130 08:22:10.622198 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" event={"ID":"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef","Type":"ContainerDied","Data":"ff6385d320f64d2aa5da3b6fecd0209ec974596a453947b1b96d5f37c7910d99"} Jan 30 08:22:10 crc kubenswrapper[4870]: I0130 08:22:10.624318 4870 generic.go:334] "Generic (PLEG): container finished" podID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerID="86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c" exitCode=0 Jan 30 08:22:10 crc kubenswrapper[4870]: I0130 08:22:10.624384 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gt6dl" event={"ID":"d56954f4-c2bc-42a1-bfa9-51433acd8c15","Type":"ContainerDied","Data":"86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c"} Jan 30 08:22:10 crc 
kubenswrapper[4870]: I0130 08:22:10.624427 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gt6dl" event={"ID":"d56954f4-c2bc-42a1-bfa9-51433acd8c15","Type":"ContainerStarted","Data":"16fd0a4741adbeb0be8f4ebb1f22a43c1cc7a74b395ee8549eea57d0f522edff"} Jan 30 08:22:11 crc kubenswrapper[4870]: I0130 08:22:11.637576 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gt6dl" event={"ID":"d56954f4-c2bc-42a1-bfa9-51433acd8c15","Type":"ContainerStarted","Data":"eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067"} Jan 30 08:22:11 crc kubenswrapper[4870]: I0130 08:22:11.962511 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.139783 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-bundle\") pod \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.139863 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm5k4\" (UniqueName: \"kubernetes.io/projected/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-kube-api-access-nm5k4\") pod \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.139952 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-util\") pod \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\" (UID: \"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef\") " Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.141808 4870 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-bundle" (OuterVolumeSpecName: "bundle") pod "f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" (UID: "f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.157813 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-kube-api-access-nm5k4" (OuterVolumeSpecName: "kube-api-access-nm5k4") pod "f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" (UID: "f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef"). InnerVolumeSpecName "kube-api-access-nm5k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.170589 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-util" (OuterVolumeSpecName: "util") pod "f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" (UID: "f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.242137 4870 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.242191 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm5k4\" (UniqueName: \"kubernetes.io/projected/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-kube-api-access-nm5k4\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.242213 4870 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef-util\") on node \"crc\" DevicePath \"\"" Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.649198 4870 generic.go:334] "Generic (PLEG): container finished" podID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerID="eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067" exitCode=0 Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.650436 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gt6dl" event={"ID":"d56954f4-c2bc-42a1-bfa9-51433acd8c15","Type":"ContainerDied","Data":"eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067"} Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.656040 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" event={"ID":"f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef","Type":"ContainerDied","Data":"34734030669c64ebec5619500a21ee834583e2523ce16b72917ee829c9a330c2"} Jan 30 08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.656102 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34734030669c64ebec5619500a21ee834583e2523ce16b72917ee829c9a330c2" Jan 30 
08:22:12 crc kubenswrapper[4870]: I0130 08:22:12.656227 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp" Jan 30 08:22:13 crc kubenswrapper[4870]: I0130 08:22:13.678198 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gt6dl" event={"ID":"d56954f4-c2bc-42a1-bfa9-51433acd8c15","Type":"ContainerStarted","Data":"cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3"} Jan 30 08:22:20 crc kubenswrapper[4870]: I0130 08:22:20.037711 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:20 crc kubenswrapper[4870]: I0130 08:22:20.038356 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gt6dl" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.102145 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gt6dl" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="registry-server" probeResult="failure" output=< Jan 30 08:22:21 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:22:21 crc kubenswrapper[4870]: > Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.494054 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gt6dl" podStartSLOduration=9.987654669 podStartE2EDuration="12.494029781s" podCreationTimestamp="2026-01-30 08:22:09 +0000 UTC" firstStartedPulling="2026-01-30 08:22:10.626054238 +0000 UTC m=+769.321601347" lastFinishedPulling="2026-01-30 08:22:13.13242931 +0000 UTC m=+771.827976459" observedRunningTime="2026-01-30 08:22:13.707471108 +0000 UTC m=+772.403018217" watchObservedRunningTime="2026-01-30 08:22:21.494029781 +0000 UTC m=+780.189576890" Jan 30 08:22:21 crc 
kubenswrapper[4870]: I0130 08:22:21.496636 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-567987c4fc-ff527"] Jan 30 08:22:21 crc kubenswrapper[4870]: E0130 08:22:21.496955 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerName="util" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.496974 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerName="util" Jan 30 08:22:21 crc kubenswrapper[4870]: E0130 08:22:21.496988 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerName="pull" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.496995 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerName="pull" Jan 30 08:22:21 crc kubenswrapper[4870]: E0130 08:22:21.497014 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerName="extract" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.497021 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerName="extract" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.497178 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef" containerName="extract" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.497696 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.500648 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.500921 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.500986 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-svhct" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.501325 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.501354 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.529656 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-567987c4fc-ff527"] Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.684065 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62rqb\" (UniqueName: \"kubernetes.io/projected/70a9e498-4f2a-40ff-8837-7811ffe26e2d-kube-api-access-62rqb\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.684166 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70a9e498-4f2a-40ff-8837-7811ffe26e2d-apiservice-cert\") pod 
\"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.684227 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70a9e498-4f2a-40ff-8837-7811ffe26e2d-webhook-cert\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.785952 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70a9e498-4f2a-40ff-8837-7811ffe26e2d-apiservice-cert\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.786051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70a9e498-4f2a-40ff-8837-7811ffe26e2d-webhook-cert\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.786091 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62rqb\" (UniqueName: \"kubernetes.io/projected/70a9e498-4f2a-40ff-8837-7811ffe26e2d-kube-api-access-62rqb\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc 
kubenswrapper[4870]: I0130 08:22:21.796061 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70a9e498-4f2a-40ff-8837-7811ffe26e2d-apiservice-cert\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.808187 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70a9e498-4f2a-40ff-8837-7811ffe26e2d-webhook-cert\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.815981 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62rqb\" (UniqueName: \"kubernetes.io/projected/70a9e498-4f2a-40ff-8837-7811ffe26e2d-kube-api-access-62rqb\") pod \"metallb-operator-controller-manager-567987c4fc-ff527\" (UID: \"70a9e498-4f2a-40ff-8837-7811ffe26e2d\") " pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.851040 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt"] Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.852144 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.859373 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.859442 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.867936 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5nv88" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.887505 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt"] Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.990086 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f01bc9ba-9427-4c0a-927e-56b20aca72c5-webhook-cert\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.990136 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q726w\" (UniqueName: \"kubernetes.io/projected/f01bc9ba-9427-4c0a-927e-56b20aca72c5-kube-api-access-q726w\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:21 crc kubenswrapper[4870]: I0130 08:22:21.990180 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/f01bc9ba-9427-4c0a-927e-56b20aca72c5-apiservice-cert\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.091124 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f01bc9ba-9427-4c0a-927e-56b20aca72c5-webhook-cert\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.091173 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q726w\" (UniqueName: \"kubernetes.io/projected/f01bc9ba-9427-4c0a-927e-56b20aca72c5-kube-api-access-q726w\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.091220 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f01bc9ba-9427-4c0a-927e-56b20aca72c5-apiservice-cert\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.096972 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.107888 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q726w\" (UniqueName: 
\"kubernetes.io/projected/f01bc9ba-9427-4c0a-927e-56b20aca72c5-kube-api-access-q726w\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.107959 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f01bc9ba-9427-4c0a-927e-56b20aca72c5-apiservice-cert\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.109555 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f01bc9ba-9427-4c0a-927e-56b20aca72c5-webhook-cert\") pod \"metallb-operator-webhook-server-d5db5fbbd-k8pwt\" (UID: \"f01bc9ba-9427-4c0a-927e-56b20aca72c5\") " pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.116336 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-svhct" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.124422 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.172548 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5nv88" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.181079 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.473530 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt"] Jan 30 08:22:22 crc kubenswrapper[4870]: W0130 08:22:22.490082 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01bc9ba_9427_4c0a_927e_56b20aca72c5.slice/crio-9549964e10582ed8ca13342baf610dd7408f97e3f2ba3743c7d0e8388d2b49a0 WatchSource:0}: Error finding container 9549964e10582ed8ca13342baf610dd7408f97e3f2ba3743c7d0e8388d2b49a0: Status 404 returned error can't find the container with id 9549964e10582ed8ca13342baf610dd7408f97e3f2ba3743c7d0e8388d2b49a0 Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.558185 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-567987c4fc-ff527"] Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.736090 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" event={"ID":"70a9e498-4f2a-40ff-8837-7811ffe26e2d","Type":"ContainerStarted","Data":"c7ccb18c107754ef3074b7af2eceba1094f5baf988020718b4e2e2e61eed0fb2"} Jan 30 08:22:22 crc kubenswrapper[4870]: I0130 08:22:22.737019 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" event={"ID":"f01bc9ba-9427-4c0a-927e-56b20aca72c5","Type":"ContainerStarted","Data":"9549964e10582ed8ca13342baf610dd7408f97e3f2ba3743c7d0e8388d2b49a0"} Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.249211 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.249672 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.249714 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8"
Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.250105 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f05305445b605660ea999aab22b621a1da0c30929b1af3251f46624decd30be"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.250155 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://8f05305445b605660ea999aab22b621a1da0c30929b1af3251f46624decd30be" gracePeriod=600
Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.781066 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="8f05305445b605660ea999aab22b621a1da0c30929b1af3251f46624decd30be" exitCode=0
Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.781117 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"8f05305445b605660ea999aab22b621a1da0c30929b1af3251f46624decd30be"}
Jan 30 08:22:25 crc kubenswrapper[4870]: I0130 08:22:25.781159 4870 scope.go:117] "RemoveContainer" containerID="f6c12e6d68c222de0711d262165234642f23c035c994270d4c786852d266f7a2"
Jan 30 08:22:28 crc kubenswrapper[4870]: I0130 08:22:28.823532 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" event={"ID":"f01bc9ba-9427-4c0a-927e-56b20aca72c5","Type":"ContainerStarted","Data":"ecc3d94a340fe5228a9072bfacd4d01f19a2b249b4f96ed382ae75b45ba4c200"}
Jan 30 08:22:28 crc kubenswrapper[4870]: I0130 08:22:28.824024 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt"
Jan 30 08:22:28 crc kubenswrapper[4870]: I0130 08:22:28.825493 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" event={"ID":"70a9e498-4f2a-40ff-8837-7811ffe26e2d","Type":"ContainerStarted","Data":"98496fd06f8eabaefdd0852a92ad197c462f38deda3bdb41f6c1fdf198b25ff5"}
Jan 30 08:22:28 crc kubenswrapper[4870]: I0130 08:22:28.825620 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527"
Jan 30 08:22:28 crc kubenswrapper[4870]: I0130 08:22:28.828209 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"902c9bb8b96922377d8d6da6fb79b392e9bc4a710daf7c3c1a77d5b9c2b536ac"}
Jan 30 08:22:28 crc kubenswrapper[4870]: I0130 08:22:28.849641 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt" podStartSLOduration=2.608422007 podStartE2EDuration="7.849617815s" podCreationTimestamp="2026-01-30 08:22:21 +0000 UTC" firstStartedPulling="2026-01-30 08:22:22.512075434 +0000 UTC m=+781.207622533" lastFinishedPulling="2026-01-30 08:22:27.753271202 +0000 UTC m=+786.448818341" observedRunningTime="2026-01-30 08:22:28.843459092 +0000 UTC m=+787.539006211" watchObservedRunningTime="2026-01-30 08:22:28.849617815 +0000 UTC m=+787.545164934"
Jan 30 08:22:28 crc kubenswrapper[4870]: I0130 08:22:28.873937 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527" podStartSLOduration=2.743526362 podStartE2EDuration="7.873909368s" podCreationTimestamp="2026-01-30 08:22:21 +0000 UTC" firstStartedPulling="2026-01-30 08:22:22.600647267 +0000 UTC m=+781.296194376" lastFinishedPulling="2026-01-30 08:22:27.731030233 +0000 UTC m=+786.426577382" observedRunningTime="2026-01-30 08:22:28.869689145 +0000 UTC m=+787.565236304" watchObservedRunningTime="2026-01-30 08:22:28.873909368 +0000 UTC m=+787.569456497"
Jan 30 08:22:30 crc kubenswrapper[4870]: I0130 08:22:30.102549 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gt6dl"
Jan 30 08:22:30 crc kubenswrapper[4870]: I0130 08:22:30.160002 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gt6dl"
Jan 30 08:22:30 crc kubenswrapper[4870]: I0130 08:22:30.409220 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gt6dl"]
Jan 30 08:22:31 crc kubenswrapper[4870]: I0130 08:22:31.857970 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gt6dl" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="registry-server" containerID="cri-o://cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3" gracePeriod=2
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.337527 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gt6dl"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.455977 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x2hf\" (UniqueName: \"kubernetes.io/projected/d56954f4-c2bc-42a1-bfa9-51433acd8c15-kube-api-access-9x2hf\") pod \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") "
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.456039 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-utilities\") pod \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") "
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.456083 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-catalog-content\") pod \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\" (UID: \"d56954f4-c2bc-42a1-bfa9-51433acd8c15\") "
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.456992 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-utilities" (OuterVolumeSpecName: "utilities") pod "d56954f4-c2bc-42a1-bfa9-51433acd8c15" (UID: "d56954f4-c2bc-42a1-bfa9-51433acd8c15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.469598 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56954f4-c2bc-42a1-bfa9-51433acd8c15-kube-api-access-9x2hf" (OuterVolumeSpecName: "kube-api-access-9x2hf") pod "d56954f4-c2bc-42a1-bfa9-51433acd8c15" (UID: "d56954f4-c2bc-42a1-bfa9-51433acd8c15"). InnerVolumeSpecName "kube-api-access-9x2hf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.558036 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x2hf\" (UniqueName: \"kubernetes.io/projected/d56954f4-c2bc-42a1-bfa9-51433acd8c15-kube-api-access-9x2hf\") on node \"crc\" DevicePath \"\""
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.558076 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.570529 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d56954f4-c2bc-42a1-bfa9-51433acd8c15" (UID: "d56954f4-c2bc-42a1-bfa9-51433acd8c15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.659415 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56954f4-c2bc-42a1-bfa9-51433acd8c15-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.885075 4870 generic.go:334] "Generic (PLEG): container finished" podID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerID="cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3" exitCode=0
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.885156 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gt6dl" event={"ID":"d56954f4-c2bc-42a1-bfa9-51433acd8c15","Type":"ContainerDied","Data":"cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3"}
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.885217 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gt6dl" event={"ID":"d56954f4-c2bc-42a1-bfa9-51433acd8c15","Type":"ContainerDied","Data":"16fd0a4741adbeb0be8f4ebb1f22a43c1cc7a74b395ee8549eea57d0f522edff"}
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.885230 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gt6dl"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.885252 4870 scope.go:117] "RemoveContainer" containerID="cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.919108 4870 scope.go:117] "RemoveContainer" containerID="eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.935216 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gt6dl"]
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.939289 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gt6dl"]
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.961207 4870 scope.go:117] "RemoveContainer" containerID="86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.984813 4870 scope.go:117] "RemoveContainer" containerID="cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3"
Jan 30 08:22:32 crc kubenswrapper[4870]: E0130 08:22:32.985489 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3\": container with ID starting with cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3 not found: ID does not exist" containerID="cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.985581 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3"} err="failed to get container status \"cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3\": rpc error: code = NotFound desc = could not find container \"cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3\": container with ID starting with cb0008e33f9dfe8f12fe1500f43e595181b49501dfd1a6a4cb9b1e2bb46080c3 not found: ID does not exist"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.985642 4870 scope.go:117] "RemoveContainer" containerID="eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067"
Jan 30 08:22:32 crc kubenswrapper[4870]: E0130 08:22:32.986326 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067\": container with ID starting with eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067 not found: ID does not exist" containerID="eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.986417 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067"} err="failed to get container status \"eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067\": rpc error: code = NotFound desc = could not find container \"eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067\": container with ID starting with eba381f30e92b7fbaef86ea5a4e2e0d1066c124c884cb019360f295ad221d067 not found: ID does not exist"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.986483 4870 scope.go:117] "RemoveContainer" containerID="86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c"
Jan 30 08:22:32 crc kubenswrapper[4870]: E0130 08:22:32.986923 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c\": container with ID starting with 86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c not found: ID does not exist" containerID="86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c"
Jan 30 08:22:32 crc kubenswrapper[4870]: I0130 08:22:32.986968 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c"} err="failed to get container status \"86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c\": rpc error: code = NotFound desc = could not find container \"86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c\": container with ID starting with 86dedfe79832a79b748c29ec54e77e723fd9e1f44571d75dfdd4f0e65591ae8c not found: ID does not exist"
Jan 30 08:22:34 crc kubenswrapper[4870]: I0130 08:22:34.083556 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" path="/var/lib/kubelet/pods/d56954f4-c2bc-42a1-bfa9-51433acd8c15/volumes"
Jan 30 08:22:42 crc kubenswrapper[4870]: I0130 08:22:42.184702 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-d5db5fbbd-k8pwt"
Jan 30 08:23:02 crc kubenswrapper[4870]: I0130 08:23:02.128395 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-567987c4fc-ff527"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.004719 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zwhkv"]
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.005329 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="registry-server"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.005342 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="registry-server"
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.005361 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="extract-utilities"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.005367 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="extract-utilities"
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.005375 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="extract-content"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.005381 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="extract-content"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.005491 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56954f4-c2bc-42a1-bfa9-51433acd8c15" containerName="registry-server"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.007380 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.010946 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-v8xl8"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.014620 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.019690 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.021126 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"]
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.022464 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.023982 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.055570 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"]
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.117073 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7q5pn"]
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.124221 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.127218 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.129565 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.130037 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pjwvb"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.130057 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160100 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-metrics\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160290 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzx2p\" (UniqueName: \"kubernetes.io/projected/5d3d6557-5b19-47c3-9e81-09b8dee3b239-kube-api-access-tzx2p\") pod \"frr-k8s-webhook-server-7df86c4f6c-pnddd\" (UID: \"5d3d6557-5b19-47c3-9e81-09b8dee3b239\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160410 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94sp\" (UniqueName: \"kubernetes.io/projected/008f589d-dab4-42af-9a42-cb6c00737f44-kube-api-access-n94sp\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160472 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008f589d-dab4-42af-9a42-cb6c00737f44-metrics-certs\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160546 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d3d6557-5b19-47c3-9e81-09b8dee3b239-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-pnddd\" (UID: \"5d3d6557-5b19-47c3-9e81-09b8dee3b239\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160641 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/008f589d-dab4-42af-9a42-cb6c00737f44-frr-startup\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160720 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-frr-conf\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.160781 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-reloader\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.161028 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-frr-sockets\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.152502 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-2dwrk"]
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.165800 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.173261 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.229863 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-2dwrk"]
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263354 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-metrics-certs\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263497 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n94sp\" (UniqueName: \"kubernetes.io/projected/008f589d-dab4-42af-9a42-cb6c00737f44-kube-api-access-n94sp\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263580 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008f589d-dab4-42af-9a42-cb6c00737f44-metrics-certs\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263621 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d3d6557-5b19-47c3-9e81-09b8dee3b239-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-pnddd\" (UID: \"5d3d6557-5b19-47c3-9e81-09b8dee3b239\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263688 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/008f589d-dab4-42af-9a42-cb6c00737f44-frr-startup\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263740 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmxn\" (UniqueName: \"kubernetes.io/projected/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-kube-api-access-brmxn\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263776 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-frr-conf\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263820 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69ng\" (UniqueName: \"kubernetes.io/projected/84099c66-a13e-4949-ae36-7fa85a6a6a56-kube-api-access-j69ng\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263837 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-metrics-certs\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263856 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-reloader\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263932 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-frr-sockets\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.263986 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/84099c66-a13e-4949-ae36-7fa85a6a6a56-metallb-excludel2\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.264051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-metrics\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.264094 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.264130 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-cert\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.264158 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzx2p\" (UniqueName: \"kubernetes.io/projected/5d3d6557-5b19-47c3-9e81-09b8dee3b239-kube-api-access-tzx2p\") pod \"frr-k8s-webhook-server-7df86c4f6c-pnddd\" (UID: \"5d3d6557-5b19-47c3-9e81-09b8dee3b239\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.264820 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-reloader\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.265042 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-frr-sockets\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.265227 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-metrics\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.265931 4870 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.266196 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008f589d-dab4-42af-9a42-cb6c00737f44-metrics-certs podName:008f589d-dab4-42af-9a42-cb6c00737f44 nodeName:}" failed. No retries permitted until 2026-01-30 08:23:03.76618111 +0000 UTC m=+822.461728219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008f589d-dab4-42af-9a42-cb6c00737f44-metrics-certs") pod "frr-k8s-zwhkv" (UID: "008f589d-dab4-42af-9a42-cb6c00737f44") : secret "frr-k8s-certs-secret" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.266813 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/008f589d-dab4-42af-9a42-cb6c00737f44-frr-conf\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.266950 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/008f589d-dab4-42af-9a42-cb6c00737f44-frr-startup\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.288186 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d3d6557-5b19-47c3-9e81-09b8dee3b239-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-pnddd\" (UID: \"5d3d6557-5b19-47c3-9e81-09b8dee3b239\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.291401 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n94sp\" (UniqueName: \"kubernetes.io/projected/008f589d-dab4-42af-9a42-cb6c00737f44-kube-api-access-n94sp\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.291951 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzx2p\" (UniqueName: \"kubernetes.io/projected/5d3d6557-5b19-47c3-9e81-09b8dee3b239-kube-api-access-tzx2p\") pod \"frr-k8s-webhook-server-7df86c4f6c-pnddd\" (UID: \"5d3d6557-5b19-47c3-9e81-09b8dee3b239\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.336948 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.365494 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/84099c66-a13e-4949-ae36-7fa85a6a6a56-metallb-excludel2\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.365541 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.365561 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-cert\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.365585 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-metrics-certs\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.365644 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmxn\" (UniqueName: \"kubernetes.io/projected/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-kube-api-access-brmxn\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.365667 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j69ng\" (UniqueName: \"kubernetes.io/projected/84099c66-a13e-4949-ae36-7fa85a6a6a56-kube-api-access-j69ng\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.365684 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-metrics-certs\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.365797 4870 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.365834 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-metrics-certs podName:b8c43bdb-2bfa-445b-9526-a03eb3f3ca20 nodeName:}" failed. No retries permitted until 2026-01-30 08:23:03.865822031 +0000 UTC m=+822.561369140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-metrics-certs") pod "controller-6968d8fdc4-2dwrk" (UID: "b8c43bdb-2bfa-445b-9526-a03eb3f3ca20") : secret "controller-certs-secret" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.366475 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/84099c66-a13e-4949-ae36-7fa85a6a6a56-metallb-excludel2\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.366535 4870 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.366563 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist podName:84099c66-a13e-4949-ae36-7fa85a6a6a56 nodeName:}" failed. No retries permitted until 2026-01-30 08:23:03.866554254 +0000 UTC m=+822.562101363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist") pod "speaker-7q5pn" (UID: "84099c66-a13e-4949-ae36-7fa85a6a6a56") : secret "metallb-memberlist" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.366624 4870 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.366648 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-metrics-certs podName:84099c66-a13e-4949-ae36-7fa85a6a6a56 nodeName:}" failed. No retries permitted until 2026-01-30 08:23:03.866642086 +0000 UTC m=+822.562189195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-metrics-certs") pod "speaker-7q5pn" (UID: "84099c66-a13e-4949-ae36-7fa85a6a6a56") : secret "speaker-certs-secret" not found
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.368204 4870 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.381791 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-cert\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.381955 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69ng\" (UniqueName: \"kubernetes.io/projected/84099c66-a13e-4949-ae36-7fa85a6a6a56-kube-api-access-j69ng\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.385807 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmxn\" (UniqueName: \"kubernetes.io/projected/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-kube-api-access-brmxn\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk"
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.766821 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd"]
Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.772291 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008f589d-dab4-42af-9a42-cb6c00737f44-metrics-certs\") pod
\"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv" Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.781989 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008f589d-dab4-42af-9a42-cb6c00737f44-metrics-certs\") pod \"frr-k8s-zwhkv\" (UID: \"008f589d-dab4-42af-9a42-cb6c00737f44\") " pod="metallb-system/frr-k8s-zwhkv" Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.874374 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-metrics-certs\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk" Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.874988 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn" Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.875053 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-metrics-certs\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn" Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.875338 4870 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 08:23:03 crc kubenswrapper[4870]: E0130 08:23:03.875493 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist podName:84099c66-a13e-4949-ae36-7fa85a6a6a56 nodeName:}" failed. 
No retries permitted until 2026-01-30 08:23:04.875453242 +0000 UTC m=+823.571000531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist") pod "speaker-7q5pn" (UID: "84099c66-a13e-4949-ae36-7fa85a6a6a56") : secret "metallb-memberlist" not found Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.881236 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-metrics-certs\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn" Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.883211 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8c43bdb-2bfa-445b-9526-a03eb3f3ca20-metrics-certs\") pod \"controller-6968d8fdc4-2dwrk\" (UID: \"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20\") " pod="metallb-system/controller-6968d8fdc4-2dwrk" Jan 30 08:23:03 crc kubenswrapper[4870]: I0130 08:23:03.928436 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zwhkv" Jan 30 08:23:04 crc kubenswrapper[4870]: I0130 08:23:04.118011 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-2dwrk" Jan 30 08:23:04 crc kubenswrapper[4870]: I0130 08:23:04.165742 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerStarted","Data":"ecb2ffa0215951bd5b68430a9bc7995140a05fff24a8628b6b8cb7567e4ca4d8"} Jan 30 08:23:04 crc kubenswrapper[4870]: I0130 08:23:04.168146 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd" event={"ID":"5d3d6557-5b19-47c3-9e81-09b8dee3b239","Type":"ContainerStarted","Data":"ad98ce5aaf450a211c769bbcf26d2bc9f218e7a3e5f4b71d8e217eb142b2ca51"} Jan 30 08:23:04 crc kubenswrapper[4870]: I0130 08:23:04.395738 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-2dwrk"] Jan 30 08:23:04 crc kubenswrapper[4870]: W0130 08:23:04.401319 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c43bdb_2bfa_445b_9526_a03eb3f3ca20.slice/crio-21aa7840d46a8d8787a99c8ae73d2ccf695c014bd734ffff4dcdb62190100de7 WatchSource:0}: Error finding container 21aa7840d46a8d8787a99c8ae73d2ccf695c014bd734ffff4dcdb62190100de7: Status 404 returned error can't find the container with id 21aa7840d46a8d8787a99c8ae73d2ccf695c014bd734ffff4dcdb62190100de7 Jan 30 08:23:04 crc kubenswrapper[4870]: I0130 08:23:04.891646 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn" Jan 30 08:23:04 crc kubenswrapper[4870]: I0130 08:23:04.900450 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/84099c66-a13e-4949-ae36-7fa85a6a6a56-memberlist\") pod \"speaker-7q5pn\" (UID: \"84099c66-a13e-4949-ae36-7fa85a6a6a56\") " pod="metallb-system/speaker-7q5pn" Jan 30 08:23:04 crc kubenswrapper[4870]: I0130 08:23:04.952997 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7q5pn" Jan 30 08:23:04 crc kubenswrapper[4870]: W0130 08:23:04.980348 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84099c66_a13e_4949_ae36_7fa85a6a6a56.slice/crio-85e4c1cf49f7964f2023663bbe40b88939b1fa679ece7a89b6a3698459539c7c WatchSource:0}: Error finding container 85e4c1cf49f7964f2023663bbe40b88939b1fa679ece7a89b6a3698459539c7c: Status 404 returned error can't find the container with id 85e4c1cf49f7964f2023663bbe40b88939b1fa679ece7a89b6a3698459539c7c Jan 30 08:23:05 crc kubenswrapper[4870]: I0130 08:23:05.186941 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2dwrk" event={"ID":"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20","Type":"ContainerStarted","Data":"3d5c87be1abee24970012c8766e657f259c3ff7ee82cb8e4682fee1e6545daf3"} Jan 30 08:23:05 crc kubenswrapper[4870]: I0130 08:23:05.187003 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2dwrk" event={"ID":"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20","Type":"ContainerStarted","Data":"c48c352b27effa7cbb1874f70695798e077923427c77846470ddc075dbfa8dea"} Jan 30 08:23:05 crc kubenswrapper[4870]: I0130 08:23:05.187022 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2dwrk" event={"ID":"b8c43bdb-2bfa-445b-9526-a03eb3f3ca20","Type":"ContainerStarted","Data":"21aa7840d46a8d8787a99c8ae73d2ccf695c014bd734ffff4dcdb62190100de7"} Jan 30 08:23:05 crc kubenswrapper[4870]: I0130 08:23:05.188346 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-6968d8fdc4-2dwrk" Jan 30 08:23:05 crc kubenswrapper[4870]: I0130 08:23:05.205793 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7q5pn" event={"ID":"84099c66-a13e-4949-ae36-7fa85a6a6a56","Type":"ContainerStarted","Data":"85e4c1cf49f7964f2023663bbe40b88939b1fa679ece7a89b6a3698459539c7c"} Jan 30 08:23:05 crc kubenswrapper[4870]: I0130 08:23:05.229979 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-2dwrk" podStartSLOduration=2.229963615 podStartE2EDuration="2.229963615s" podCreationTimestamp="2026-01-30 08:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:23:05.227457857 +0000 UTC m=+823.923004966" watchObservedRunningTime="2026-01-30 08:23:05.229963615 +0000 UTC m=+823.925510724" Jan 30 08:23:06 crc kubenswrapper[4870]: I0130 08:23:06.242106 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7q5pn" event={"ID":"84099c66-a13e-4949-ae36-7fa85a6a6a56","Type":"ContainerStarted","Data":"f5e3a2f7ff44e533b19e51c89160d612b7d96a53b8be859640b1eb341b44a9af"} Jan 30 08:23:06 crc kubenswrapper[4870]: I0130 08:23:06.246603 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7q5pn" event={"ID":"84099c66-a13e-4949-ae36-7fa85a6a6a56","Type":"ContainerStarted","Data":"abaf353f709893509b53cd49a35ad8650090d541b7050cd5cc67e814e0605ad6"} Jan 30 08:23:06 crc kubenswrapper[4870]: I0130 08:23:06.246628 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7q5pn" Jan 30 08:23:06 crc kubenswrapper[4870]: I0130 08:23:06.273072 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7q5pn" podStartSLOduration=3.273046584 podStartE2EDuration="3.273046584s" podCreationTimestamp="2026-01-30 08:23:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:23:06.262789673 +0000 UTC m=+824.958336782" watchObservedRunningTime="2026-01-30 08:23:06.273046584 +0000 UTC m=+824.968593693" Jan 30 08:23:12 crc kubenswrapper[4870]: I0130 08:23:12.321124 4870 generic.go:334] "Generic (PLEG): container finished" podID="008f589d-dab4-42af-9a42-cb6c00737f44" containerID="80cbadd69312293fca500c2c971a875ea76ef3bdb2f21bf6a70ff5ad0a4f6a30" exitCode=0 Jan 30 08:23:12 crc kubenswrapper[4870]: I0130 08:23:12.321292 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerDied","Data":"80cbadd69312293fca500c2c971a875ea76ef3bdb2f21bf6a70ff5ad0a4f6a30"} Jan 30 08:23:12 crc kubenswrapper[4870]: I0130 08:23:12.325830 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd" event={"ID":"5d3d6557-5b19-47c3-9e81-09b8dee3b239","Type":"ContainerStarted","Data":"b5d9070f6ddcf0d0673a8ea448988136752b8a01ebb223d220e0464931ee2560"} Jan 30 08:23:12 crc kubenswrapper[4870]: I0130 08:23:12.326142 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd" Jan 30 08:23:12 crc kubenswrapper[4870]: I0130 08:23:12.400170 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd" podStartSLOduration=1.386660182 podStartE2EDuration="9.400125134s" podCreationTimestamp="2026-01-30 08:23:03 +0000 UTC" firstStartedPulling="2026-01-30 08:23:03.779133386 +0000 UTC m=+822.474680485" lastFinishedPulling="2026-01-30 08:23:11.792598328 +0000 UTC m=+830.488145437" observedRunningTime="2026-01-30 08:23:12.392720731 +0000 UTC m=+831.088267870" watchObservedRunningTime="2026-01-30 08:23:12.400125134 +0000 UTC m=+831.095672273" Jan 30 
08:23:13 crc kubenswrapper[4870]: I0130 08:23:13.339571 4870 generic.go:334] "Generic (PLEG): container finished" podID="008f589d-dab4-42af-9a42-cb6c00737f44" containerID="198af9d63563856ccb8d6a572a13ca85cf978f2430ad5c5123d0f79a257559fc" exitCode=0 Jan 30 08:23:13 crc kubenswrapper[4870]: I0130 08:23:13.341454 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerDied","Data":"198af9d63563856ccb8d6a572a13ca85cf978f2430ad5c5123d0f79a257559fc"} Jan 30 08:23:14 crc kubenswrapper[4870]: I0130 08:23:14.125354 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-2dwrk" Jan 30 08:23:14 crc kubenswrapper[4870]: I0130 08:23:14.348985 4870 generic.go:334] "Generic (PLEG): container finished" podID="008f589d-dab4-42af-9a42-cb6c00737f44" containerID="85498cf689e4c6dc227af410f98c3d3dbcdfc33806253794730e19a8da67ed36" exitCode=0 Jan 30 08:23:14 crc kubenswrapper[4870]: I0130 08:23:14.349034 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerDied","Data":"85498cf689e4c6dc227af410f98c3d3dbcdfc33806253794730e19a8da67ed36"} Jan 30 08:23:15 crc kubenswrapper[4870]: I0130 08:23:15.366975 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerStarted","Data":"80d5a89ce3d77979dfd983dff4ec762a001f3cdc86340cb3e3c2005d602d6d60"} Jan 30 08:23:15 crc kubenswrapper[4870]: I0130 08:23:15.367433 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerStarted","Data":"c33a2a2501e909f95713117af712fcc9abab5a57e7972491254d6b729622c353"} Jan 30 08:23:15 crc kubenswrapper[4870]: I0130 08:23:15.367453 4870 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerStarted","Data":"ab2d48e04b8b9d44659b0227037a829e8ed5c20ebfb3d9502a2c7000338fc051"} Jan 30 08:23:15 crc kubenswrapper[4870]: I0130 08:23:15.367471 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerStarted","Data":"60fb3eb637725398e2b738818807814acdc09a14628bd0ee1e8ce3900f9231a8"} Jan 30 08:23:15 crc kubenswrapper[4870]: I0130 08:23:15.367488 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerStarted","Data":"7ad5e234345a49675c16ae95e5800d919b2fc4f07402c7d3e128db23c50c6726"} Jan 30 08:23:16 crc kubenswrapper[4870]: I0130 08:23:16.382986 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zwhkv" event={"ID":"008f589d-dab4-42af-9a42-cb6c00737f44","Type":"ContainerStarted","Data":"b72cb05bacf644fbbf2323975c9d7ece66c9cc42ea6ce09fec59bee5d456e473"} Jan 30 08:23:16 crc kubenswrapper[4870]: I0130 08:23:16.383626 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zwhkv" Jan 30 08:23:16 crc kubenswrapper[4870]: I0130 08:23:16.424313 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zwhkv" podStartSLOduration=6.753964155 podStartE2EDuration="14.424280007s" podCreationTimestamp="2026-01-30 08:23:02 +0000 UTC" firstStartedPulling="2026-01-30 08:23:04.121101469 +0000 UTC m=+822.816648608" lastFinishedPulling="2026-01-30 08:23:11.791417351 +0000 UTC m=+830.486964460" observedRunningTime="2026-01-30 08:23:16.417996519 +0000 UTC m=+835.113543678" watchObservedRunningTime="2026-01-30 08:23:16.424280007 +0000 UTC m=+835.119827156" Jan 30 08:23:18 crc kubenswrapper[4870]: I0130 08:23:18.929384 4870 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zwhkv" Jan 30 08:23:19 crc kubenswrapper[4870]: I0130 08:23:19.009885 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zwhkv" Jan 30 08:23:23 crc kubenswrapper[4870]: I0130 08:23:23.350529 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-pnddd" Jan 30 08:23:24 crc kubenswrapper[4870]: I0130 08:23:24.956868 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7q5pn" Jan 30 08:23:27 crc kubenswrapper[4870]: I0130 08:23:27.956606 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vk67k"] Jan 30 08:23:27 crc kubenswrapper[4870]: I0130 08:23:27.957915 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vk67k" Jan 30 08:23:27 crc kubenswrapper[4870]: I0130 08:23:27.972987 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 30 08:23:27 crc kubenswrapper[4870]: I0130 08:23:27.973354 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9lk7j" Jan 30 08:23:27 crc kubenswrapper[4870]: I0130 08:23:27.973781 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 30 08:23:27 crc kubenswrapper[4870]: I0130 08:23:27.990567 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vk67k"] Jan 30 08:23:28 crc kubenswrapper[4870]: I0130 08:23:28.081379 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stjxk\" (UniqueName: 
\"kubernetes.io/projected/7d303d04-4cbe-4ca5-b134-f2f8312c227d-kube-api-access-stjxk\") pod \"openstack-operator-index-vk67k\" (UID: \"7d303d04-4cbe-4ca5-b134-f2f8312c227d\") " pod="openstack-operators/openstack-operator-index-vk67k" Jan 30 08:23:28 crc kubenswrapper[4870]: I0130 08:23:28.183567 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stjxk\" (UniqueName: \"kubernetes.io/projected/7d303d04-4cbe-4ca5-b134-f2f8312c227d-kube-api-access-stjxk\") pod \"openstack-operator-index-vk67k\" (UID: \"7d303d04-4cbe-4ca5-b134-f2f8312c227d\") " pod="openstack-operators/openstack-operator-index-vk67k" Jan 30 08:23:28 crc kubenswrapper[4870]: I0130 08:23:28.212999 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stjxk\" (UniqueName: \"kubernetes.io/projected/7d303d04-4cbe-4ca5-b134-f2f8312c227d-kube-api-access-stjxk\") pod \"openstack-operator-index-vk67k\" (UID: \"7d303d04-4cbe-4ca5-b134-f2f8312c227d\") " pod="openstack-operators/openstack-operator-index-vk67k" Jan 30 08:23:28 crc kubenswrapper[4870]: I0130 08:23:28.275806 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vk67k" Jan 30 08:23:28 crc kubenswrapper[4870]: I0130 08:23:28.527280 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vk67k"] Jan 30 08:23:29 crc kubenswrapper[4870]: I0130 08:23:29.510151 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vk67k" event={"ID":"7d303d04-4cbe-4ca5-b134-f2f8312c227d","Type":"ContainerStarted","Data":"4dd9c22d1339c45953edde7fd3b89d1a6b9500ddda6ad95e56185d56d08f2af7"} Jan 30 08:23:31 crc kubenswrapper[4870]: I0130 08:23:31.114688 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vk67k"] Jan 30 08:23:31 crc kubenswrapper[4870]: I0130 08:23:31.723842 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4bccf"] Jan 30 08:23:31 crc kubenswrapper[4870]: I0130 08:23:31.727440 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4bccf" Jan 30 08:23:31 crc kubenswrapper[4870]: I0130 08:23:31.750598 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4bccf"] Jan 30 08:23:31 crc kubenswrapper[4870]: I0130 08:23:31.839394 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txn25\" (UniqueName: \"kubernetes.io/projected/c79c7300-5362-40dc-a952-2193e7a6908b-kube-api-access-txn25\") pod \"openstack-operator-index-4bccf\" (UID: \"c79c7300-5362-40dc-a952-2193e7a6908b\") " pod="openstack-operators/openstack-operator-index-4bccf" Jan 30 08:23:31 crc kubenswrapper[4870]: I0130 08:23:31.940834 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txn25\" (UniqueName: \"kubernetes.io/projected/c79c7300-5362-40dc-a952-2193e7a6908b-kube-api-access-txn25\") pod \"openstack-operator-index-4bccf\" (UID: \"c79c7300-5362-40dc-a952-2193e7a6908b\") " pod="openstack-operators/openstack-operator-index-4bccf" Jan 30 08:23:31 crc kubenswrapper[4870]: I0130 08:23:31.979606 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txn25\" (UniqueName: \"kubernetes.io/projected/c79c7300-5362-40dc-a952-2193e7a6908b-kube-api-access-txn25\") pod \"openstack-operator-index-4bccf\" (UID: \"c79c7300-5362-40dc-a952-2193e7a6908b\") " pod="openstack-operators/openstack-operator-index-4bccf" Jan 30 08:23:32 crc kubenswrapper[4870]: I0130 08:23:32.055667 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4bccf" Jan 30 08:23:32 crc kubenswrapper[4870]: I0130 08:23:32.325591 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4bccf"] Jan 30 08:23:32 crc kubenswrapper[4870]: I0130 08:23:32.547944 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vk67k" event={"ID":"7d303d04-4cbe-4ca5-b134-f2f8312c227d","Type":"ContainerStarted","Data":"16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205"} Jan 30 08:23:32 crc kubenswrapper[4870]: I0130 08:23:32.548189 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vk67k" podUID="7d303d04-4cbe-4ca5-b134-f2f8312c227d" containerName="registry-server" containerID="cri-o://16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205" gracePeriod=2 Jan 30 08:23:32 crc kubenswrapper[4870]: I0130 08:23:32.553070 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4bccf" event={"ID":"c79c7300-5362-40dc-a952-2193e7a6908b","Type":"ContainerStarted","Data":"2f9a22a68968c2a49c744d38054cc9daa6f102fad302a50844154ba07d842499"} Jan 30 08:23:32 crc kubenswrapper[4870]: I0130 08:23:32.572494 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vk67k" podStartSLOduration=2.333599409 podStartE2EDuration="5.572456501s" podCreationTimestamp="2026-01-30 08:23:27 +0000 UTC" firstStartedPulling="2026-01-30 08:23:28.538248762 +0000 UTC m=+847.233795871" lastFinishedPulling="2026-01-30 08:23:31.777105854 +0000 UTC m=+850.472652963" observedRunningTime="2026-01-30 08:23:32.569839439 +0000 UTC m=+851.265386558" watchObservedRunningTime="2026-01-30 08:23:32.572456501 +0000 UTC m=+851.268003980" Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.056377 4870 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vk67k" Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.160202 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stjxk\" (UniqueName: \"kubernetes.io/projected/7d303d04-4cbe-4ca5-b134-f2f8312c227d-kube-api-access-stjxk\") pod \"7d303d04-4cbe-4ca5-b134-f2f8312c227d\" (UID: \"7d303d04-4cbe-4ca5-b134-f2f8312c227d\") " Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.170843 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d303d04-4cbe-4ca5-b134-f2f8312c227d-kube-api-access-stjxk" (OuterVolumeSpecName: "kube-api-access-stjxk") pod "7d303d04-4cbe-4ca5-b134-f2f8312c227d" (UID: "7d303d04-4cbe-4ca5-b134-f2f8312c227d"). InnerVolumeSpecName "kube-api-access-stjxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.262588 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stjxk\" (UniqueName: \"kubernetes.io/projected/7d303d04-4cbe-4ca5-b134-f2f8312c227d-kube-api-access-stjxk\") on node \"crc\" DevicePath \"\"" Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.564474 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4bccf" event={"ID":"c79c7300-5362-40dc-a952-2193e7a6908b","Type":"ContainerStarted","Data":"439e79c69ead461c3d52bd6167951926e409dcda33f4f96731f074ca8e53b78d"} Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.567450 4870 generic.go:334] "Generic (PLEG): container finished" podID="7d303d04-4cbe-4ca5-b134-f2f8312c227d" containerID="16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205" exitCode=0 Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.567525 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vk67k" 
event={"ID":"7d303d04-4cbe-4ca5-b134-f2f8312c227d","Type":"ContainerDied","Data":"16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205"} Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.567589 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vk67k" event={"ID":"7d303d04-4cbe-4ca5-b134-f2f8312c227d","Type":"ContainerDied","Data":"4dd9c22d1339c45953edde7fd3b89d1a6b9500ddda6ad95e56185d56d08f2af7"} Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.567620 4870 scope.go:117] "RemoveContainer" containerID="16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205" Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.567861 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vk67k" Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.600368 4870 scope.go:117] "RemoveContainer" containerID="16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205" Jan 30 08:23:33 crc kubenswrapper[4870]: E0130 08:23:33.600859 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205\": container with ID starting with 16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205 not found: ID does not exist" containerID="16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205" Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.600948 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205"} err="failed to get container status \"16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205\": rpc error: code = NotFound desc = could not find container \"16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205\": container with ID starting with 
16a6a78d6c093d90f00b838a5288df35c2f1a86506f33234e1ab0bc9a5df8205 not found: ID does not exist"
Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.602050 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4bccf" podStartSLOduration=2.5446929750000002 podStartE2EDuration="2.602030206s" podCreationTimestamp="2026-01-30 08:23:31 +0000 UTC" firstStartedPulling="2026-01-30 08:23:32.340615998 +0000 UTC m=+851.036163107" lastFinishedPulling="2026-01-30 08:23:32.397953229 +0000 UTC m=+851.093500338" observedRunningTime="2026-01-30 08:23:33.595347306 +0000 UTC m=+852.290894455" watchObservedRunningTime="2026-01-30 08:23:33.602030206 +0000 UTC m=+852.297577355"
Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.641363 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vk67k"]
Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.650006 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vk67k"]
Jan 30 08:23:33 crc kubenswrapper[4870]: I0130 08:23:33.938684 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zwhkv"
Jan 30 08:23:34 crc kubenswrapper[4870]: I0130 08:23:34.093989 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d303d04-4cbe-4ca5-b134-f2f8312c227d" path="/var/lib/kubelet/pods/7d303d04-4cbe-4ca5-b134-f2f8312c227d/volumes"
Jan 30 08:23:42 crc kubenswrapper[4870]: I0130 08:23:42.056252 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-4bccf"
Jan 30 08:23:42 crc kubenswrapper[4870]: I0130 08:23:42.056838 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-4bccf"
Jan 30 08:23:42 crc kubenswrapper[4870]: I0130 08:23:42.124859 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4bccf"
Jan 30 08:23:42 crc kubenswrapper[4870]: I0130 08:23:42.695461 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4bccf"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.172139 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"]
Jan 30 08:23:43 crc kubenswrapper[4870]: E0130 08:23:43.172420 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d303d04-4cbe-4ca5-b134-f2f8312c227d" containerName="registry-server"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.172432 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d303d04-4cbe-4ca5-b134-f2f8312c227d" containerName="registry-server"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.172551 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d303d04-4cbe-4ca5-b134-f2f8312c227d" containerName="registry-server"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.173435 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.177127 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kcgg7"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.182431 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"]
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.338696 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-util\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.339687 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-bundle\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.339737 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8tbg\" (UniqueName: \"kubernetes.io/projected/6f3c7406-d095-434b-a79a-f24373a9b141-kube-api-access-b8tbg\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.441033 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-util\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.441095 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-bundle\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.441128 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8tbg\" (UniqueName: \"kubernetes.io/projected/6f3c7406-d095-434b-a79a-f24373a9b141-kube-api-access-b8tbg\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.441831 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-util\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.442039 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-bundle\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.466766 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8tbg\" (UniqueName: \"kubernetes.io/projected/6f3c7406-d095-434b-a79a-f24373a9b141-kube-api-access-b8tbg\") pod \"538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") " pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.493730 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:43 crc kubenswrapper[4870]: I0130 08:23:43.767187 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"]
Jan 30 08:23:44 crc kubenswrapper[4870]: E0130 08:23:44.094144 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3c7406_d095_434b_a79a_f24373a9b141.slice/crio-fbe751daab30c254bc9548e5b9ad99cb96f0f442bdf9a0bca64985b96e1e512f.scope\": RecentStats: unable to find data in memory cache]"
Jan 30 08:23:44 crc kubenswrapper[4870]: I0130 08:23:44.678090 4870 generic.go:334] "Generic (PLEG): container finished" podID="6f3c7406-d095-434b-a79a-f24373a9b141" containerID="fbe751daab30c254bc9548e5b9ad99cb96f0f442bdf9a0bca64985b96e1e512f" exitCode=0
Jan 30 08:23:44 crc kubenswrapper[4870]: I0130 08:23:44.678207 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw" event={"ID":"6f3c7406-d095-434b-a79a-f24373a9b141","Type":"ContainerDied","Data":"fbe751daab30c254bc9548e5b9ad99cb96f0f442bdf9a0bca64985b96e1e512f"}
Jan 30 08:23:44 crc kubenswrapper[4870]: I0130 08:23:44.678561 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw" event={"ID":"6f3c7406-d095-434b-a79a-f24373a9b141","Type":"ContainerStarted","Data":"8d58849cc2eca4dbbd8aedc621bd2b4682a82f44b1d861543780b17030ac22a6"}
Jan 30 08:23:45 crc kubenswrapper[4870]: I0130 08:23:45.691241 4870 generic.go:334] "Generic (PLEG): container finished" podID="6f3c7406-d095-434b-a79a-f24373a9b141" containerID="83f08d103cec33c1c080b7171ebb52042419b4ce73f6c86cb79eb07b86f85255" exitCode=0
Jan 30 08:23:45 crc kubenswrapper[4870]: I0130 08:23:45.691330 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw" event={"ID":"6f3c7406-d095-434b-a79a-f24373a9b141","Type":"ContainerDied","Data":"83f08d103cec33c1c080b7171ebb52042419b4ce73f6c86cb79eb07b86f85255"}
Jan 30 08:23:46 crc kubenswrapper[4870]: I0130 08:23:46.702454 4870 generic.go:334] "Generic (PLEG): container finished" podID="6f3c7406-d095-434b-a79a-f24373a9b141" containerID="73a62a4577b0873fb9119073cd13c8c85100124b2cc27968fc48f710ab2d3107" exitCode=0
Jan 30 08:23:46 crc kubenswrapper[4870]: I0130 08:23:46.702540 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw" event={"ID":"6f3c7406-d095-434b-a79a-f24373a9b141","Type":"ContainerDied","Data":"73a62a4577b0873fb9119073cd13c8c85100124b2cc27968fc48f710ab2d3107"}
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.157366 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.218159 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-util\") pod \"6f3c7406-d095-434b-a79a-f24373a9b141\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") "
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.219104 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8tbg\" (UniqueName: \"kubernetes.io/projected/6f3c7406-d095-434b-a79a-f24373a9b141-kube-api-access-b8tbg\") pod \"6f3c7406-d095-434b-a79a-f24373a9b141\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") "
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.219290 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-bundle\") pod \"6f3c7406-d095-434b-a79a-f24373a9b141\" (UID: \"6f3c7406-d095-434b-a79a-f24373a9b141\") "
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.220228 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-bundle" (OuterVolumeSpecName: "bundle") pod "6f3c7406-d095-434b-a79a-f24373a9b141" (UID: "6f3c7406-d095-434b-a79a-f24373a9b141"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.220741 4870 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.228277 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3c7406-d095-434b-a79a-f24373a9b141-kube-api-access-b8tbg" (OuterVolumeSpecName: "kube-api-access-b8tbg") pod "6f3c7406-d095-434b-a79a-f24373a9b141" (UID: "6f3c7406-d095-434b-a79a-f24373a9b141"). InnerVolumeSpecName "kube-api-access-b8tbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.242883 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-util" (OuterVolumeSpecName: "util") pod "6f3c7406-d095-434b-a79a-f24373a9b141" (UID: "6f3c7406-d095-434b-a79a-f24373a9b141"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.322719 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8tbg\" (UniqueName: \"kubernetes.io/projected/6f3c7406-d095-434b-a79a-f24373a9b141-kube-api-access-b8tbg\") on node \"crc\" DevicePath \"\""
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.322774 4870 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f3c7406-d095-434b-a79a-f24373a9b141-util\") on node \"crc\" DevicePath \"\""
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.730199 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw" event={"ID":"6f3c7406-d095-434b-a79a-f24373a9b141","Type":"ContainerDied","Data":"8d58849cc2eca4dbbd8aedc621bd2b4682a82f44b1d861543780b17030ac22a6"}
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.730260 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d58849cc2eca4dbbd8aedc621bd2b4682a82f44b1d861543780b17030ac22a6"
Jan 30 08:23:48 crc kubenswrapper[4870]: I0130 08:23:48.730376 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.724764 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"]
Jan 30 08:23:55 crc kubenswrapper[4870]: E0130 08:23:55.725828 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3c7406-d095-434b-a79a-f24373a9b141" containerName="pull"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.725845 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3c7406-d095-434b-a79a-f24373a9b141" containerName="pull"
Jan 30 08:23:55 crc kubenswrapper[4870]: E0130 08:23:55.725881 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3c7406-d095-434b-a79a-f24373a9b141" containerName="util"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.725909 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3c7406-d095-434b-a79a-f24373a9b141" containerName="util"
Jan 30 08:23:55 crc kubenswrapper[4870]: E0130 08:23:55.725923 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3c7406-d095-434b-a79a-f24373a9b141" containerName="extract"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.725933 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3c7406-d095-434b-a79a-f24373a9b141" containerName="extract"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.726078 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3c7406-d095-434b-a79a-f24373a9b141" containerName="extract"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.726679 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.729350 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-htm2c"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.753794 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"]
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.841471 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9j4z\" (UniqueName: \"kubernetes.io/projected/b5c8b38a-bdec-4120-9802-5a35815eca01-kube-api-access-w9j4z\") pod \"openstack-operator-controller-init-594f7f44c-vnpnd\" (UID: \"b5c8b38a-bdec-4120-9802-5a35815eca01\") " pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.942792 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9j4z\" (UniqueName: \"kubernetes.io/projected/b5c8b38a-bdec-4120-9802-5a35815eca01-kube-api-access-w9j4z\") pod \"openstack-operator-controller-init-594f7f44c-vnpnd\" (UID: \"b5c8b38a-bdec-4120-9802-5a35815eca01\") " pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"
Jan 30 08:23:55 crc kubenswrapper[4870]: I0130 08:23:55.961424 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9j4z\" (UniqueName: \"kubernetes.io/projected/b5c8b38a-bdec-4120-9802-5a35815eca01-kube-api-access-w9j4z\") pod \"openstack-operator-controller-init-594f7f44c-vnpnd\" (UID: \"b5c8b38a-bdec-4120-9802-5a35815eca01\") " pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"
Jan 30 08:23:56 crc kubenswrapper[4870]: I0130 08:23:56.051286 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"
Jan 30 08:23:56 crc kubenswrapper[4870]: I0130 08:23:56.365607 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"]
Jan 30 08:23:56 crc kubenswrapper[4870]: I0130 08:23:56.793074 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd" event={"ID":"b5c8b38a-bdec-4120-9802-5a35815eca01","Type":"ContainerStarted","Data":"3bbc9db6a3a9845b1560190ce5b6bcc662aa58e84baca5090946339b19e29fa8"}
Jan 30 08:24:01 crc kubenswrapper[4870]: I0130 08:24:01.834131 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd" event={"ID":"b5c8b38a-bdec-4120-9802-5a35815eca01","Type":"ContainerStarted","Data":"0f0b4cb65752c371320e8731349c6678583cf1a80bf7ad6418a69328cac6897c"}
Jan 30 08:24:01 crc kubenswrapper[4870]: I0130 08:24:01.835082 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"
Jan 30 08:24:01 crc kubenswrapper[4870]: I0130 08:24:01.891968 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd" podStartSLOduration=2.3779210109999998 podStartE2EDuration="6.891871322s" podCreationTimestamp="2026-01-30 08:23:55 +0000 UTC" firstStartedPulling="2026-01-30 08:23:56.371324108 +0000 UTC m=+875.066871237" lastFinishedPulling="2026-01-30 08:24:00.885274399 +0000 UTC m=+879.580821548" observedRunningTime="2026-01-30 08:24:01.88003965 +0000 UTC m=+880.575586799" watchObservedRunningTime="2026-01-30 08:24:01.891871322 +0000 UTC m=+880.587418461"
Jan 30 08:24:06 crc kubenswrapper[4870]: I0130 08:24:06.057845 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-594f7f44c-vnpnd"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.765425 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.767117 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.768929 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-g8h7z"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.772589 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.773540 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.776790 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7764j"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.783999 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.806844 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.812120 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.813279 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.815170 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5th8n"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.818812 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.819887 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.821797 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-gzpfd"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.828931 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqw9\" (UniqueName: \"kubernetes.io/projected/54c01287-d66d-46bc-bbb8-7532263099c5-kube-api-access-5cqw9\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-wfpg9\" (UID: \"54c01287-d66d-46bc-bbb8-7532263099c5\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.828979 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvmk\" (UniqueName: \"kubernetes.io/projected/96be73fb-f1fc-4c5c-a643-7b9dcc832ac6-kube-api-access-vgvmk\") pod \"glance-operator-controller-manager-8886f4c47-tkrpg\" (UID: \"96be73fb-f1fc-4c5c-a643-7b9dcc832ac6\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.829323 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mh2k\" (UniqueName: \"kubernetes.io/projected/dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb-kube-api-access-4mh2k\") pod \"designate-operator-controller-manager-6d9697b7f4-grbz8\" (UID: \"dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.829422 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrpc\" (UniqueName: \"kubernetes.io/projected/e973c5f3-3291-4d4b-85ce-806ef6f83c1a-kube-api-access-wmrpc\") pod \"cinder-operator-controller-manager-8d874c8fc-hsfpq\" (UID: \"e973c5f3-3291-4d4b-85ce-806ef6f83c1a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.847281 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.853408 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.893025 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.894634 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.898552 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jtj66"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.910006 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.911517 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.916121 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-pnprh"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.928980 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.931348 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrpc\" (UniqueName: \"kubernetes.io/projected/e973c5f3-3291-4d4b-85ce-806ef6f83c1a-kube-api-access-wmrpc\") pod \"cinder-operator-controller-manager-8d874c8fc-hsfpq\" (UID: \"e973c5f3-3291-4d4b-85ce-806ef6f83c1a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.931441 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqw9\" (UniqueName: \"kubernetes.io/projected/54c01287-d66d-46bc-bbb8-7532263099c5-kube-api-access-5cqw9\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-wfpg9\" (UID: \"54c01287-d66d-46bc-bbb8-7532263099c5\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.931477 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvmk\" (UniqueName: \"kubernetes.io/projected/96be73fb-f1fc-4c5c-a643-7b9dcc832ac6-kube-api-access-vgvmk\") pod \"glance-operator-controller-manager-8886f4c47-tkrpg\" (UID: \"96be73fb-f1fc-4c5c-a643-7b9dcc832ac6\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.931533 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrl7\" (UniqueName: \"kubernetes.io/projected/925313c0-6800-4a27-814b-887b46cf49ad-kube-api-access-tgrl7\") pod \"horizon-operator-controller-manager-5fb775575f-hbmf7\" (UID: \"925313c0-6800-4a27-814b-887b46cf49ad\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.931603 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mh2k\" (UniqueName: \"kubernetes.io/projected/dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb-kube-api-access-4mh2k\") pod \"designate-operator-controller-manager-6d9697b7f4-grbz8\" (UID: \"dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.931633 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzglw\" (UniqueName: \"kubernetes.io/projected/b9449ead-e087-4895-a88a-8bdfe0835ebd-kube-api-access-rzglw\") pod \"heat-operator-controller-manager-69d6db494d-wlkxq\" (UID: \"b9449ead-e087-4895-a88a-8bdfe0835ebd\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.933836 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.948940 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-spzcf"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.950079 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.952341 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.952678 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pppwc"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.961487 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.962536 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-spzcf"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.962647 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.969314 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-znfz5"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.971687 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst"]
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.972803 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mh2k\" (UniqueName: \"kubernetes.io/projected/dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb-kube-api-access-4mh2k\") pod \"designate-operator-controller-manager-6d9697b7f4-grbz8\" (UID: \"dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.973001 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.975031 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqw9\" (UniqueName: \"kubernetes.io/projected/54c01287-d66d-46bc-bbb8-7532263099c5-kube-api-access-5cqw9\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-wfpg9\" (UID: \"54c01287-d66d-46bc-bbb8-7532263099c5\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9"
Jan 30 08:24:26 crc kubenswrapper[4870]: I0130 08:24:26.991941 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mnlnz"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.004678 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.006858 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvmk\" (UniqueName: \"kubernetes.io/projected/96be73fb-f1fc-4c5c-a643-7b9dcc832ac6-kube-api-access-vgvmk\") pod \"glance-operator-controller-manager-8886f4c47-tkrpg\" (UID: \"96be73fb-f1fc-4c5c-a643-7b9dcc832ac6\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.009300 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.010180 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.011562 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrpc\" (UniqueName: \"kubernetes.io/projected/e973c5f3-3291-4d4b-85ce-806ef6f83c1a-kube-api-access-wmrpc\") pod \"cinder-operator-controller-manager-8d874c8fc-hsfpq\" (UID: \"e973c5f3-3291-4d4b-85ce-806ef6f83c1a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.017767 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-pbfgq"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.033337 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrl7\" (UniqueName: \"kubernetes.io/projected/925313c0-6800-4a27-814b-887b46cf49ad-kube-api-access-tgrl7\") pod \"horizon-operator-controller-manager-5fb775575f-hbmf7\" (UID: \"925313c0-6800-4a27-814b-887b46cf49ad\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.033449 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzglw\" (UniqueName: \"kubernetes.io/projected/b9449ead-e087-4895-a88a-8bdfe0835ebd-kube-api-access-rzglw\") pod \"heat-operator-controller-manager-69d6db494d-wlkxq\" (UID: \"b9449ead-e087-4895-a88a-8bdfe0835ebd\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.053982 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.062035 4870 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"kube-api-access-rzglw\" (UniqueName: \"kubernetes.io/projected/b9449ead-e087-4895-a88a-8bdfe0835ebd-kube-api-access-rzglw\") pod \"heat-operator-controller-manager-69d6db494d-wlkxq\" (UID: \"b9449ead-e087-4895-a88a-8bdfe0835ebd\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.066576 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.078133 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrl7\" (UniqueName: \"kubernetes.io/projected/925313c0-6800-4a27-814b-887b46cf49ad-kube-api-access-tgrl7\") pod \"horizon-operator-controller-manager-5fb775575f-hbmf7\" (UID: \"925313c0-6800-4a27-814b-887b46cf49ad\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.083499 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.084509 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.088070 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6mztf" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.096133 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.109357 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.109405 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.115492 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.116292 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.124333 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7cjsm" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.138284 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shhs\" (UniqueName: \"kubernetes.io/projected/db7aeba5-92f5-4887-9a6a-92d8c57650d2-kube-api-access-7shhs\") pod \"keystone-operator-controller-manager-84f48565d4-rhfst\" (UID: \"db7aeba5-92f5-4887-9a6a-92d8c57650d2\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.138391 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b429\" (UniqueName: \"kubernetes.io/projected/ea3efedd-cb74-48c7-b246-b188bac37ed4-kube-api-access-5b429\") pod \"mariadb-operator-controller-manager-67bf948998-59rt2\" (UID: 
\"ea3efedd-cb74-48c7-b246-b188bac37ed4\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.138441 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lt4t\" (UniqueName: \"kubernetes.io/projected/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-kube-api-access-5lt4t\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.138460 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8fb\" (UniqueName: \"kubernetes.io/projected/5680ceb3-f5ec-4d9e-a313-13564402bff2-kube-api-access-hw8fb\") pod \"ironic-operator-controller-manager-5f4b8bd54d-5vfrj\" (UID: \"5680ceb3-f5ec-4d9e-a313-13564402bff2\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.138496 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.152503 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.157369 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.159639 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.169087 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-55q7v" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.169535 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.218823 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.238750 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.264596 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lt4t\" (UniqueName: \"kubernetes.io/projected/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-kube-api-access-5lt4t\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.264806 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8fb\" (UniqueName: \"kubernetes.io/projected/5680ceb3-f5ec-4d9e-a313-13564402bff2-kube-api-access-hw8fb\") pod \"ironic-operator-controller-manager-5f4b8bd54d-5vfrj\" (UID: \"5680ceb3-f5ec-4d9e-a313-13564402bff2\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.265098 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.265199 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7shhs\" (UniqueName: \"kubernetes.io/projected/db7aeba5-92f5-4887-9a6a-92d8c57650d2-kube-api-access-7shhs\") pod \"keystone-operator-controller-manager-84f48565d4-rhfst\" (UID: \"db7aeba5-92f5-4887-9a6a-92d8c57650d2\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.272427 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlbfg\" (UniqueName: \"kubernetes.io/projected/0ea209e2-96bf-4919-ad8f-f86de2b78ab1-kube-api-access-jlbfg\") pod \"neutron-operator-controller-manager-585dbc889-2xdfh\" (UID: \"0ea209e2-96bf-4919-ad8f-f86de2b78ab1\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.274703 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68fs\" (UniqueName: \"kubernetes.io/projected/5cde6cc5-f427-4349-8c8a-3dce0deac5a9-kube-api-access-h68fs\") pod \"manila-operator-controller-manager-7dd968899f-j9bdn\" (UID: \"5cde6cc5-f427-4349-8c8a-3dce0deac5a9\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" Jan 30 08:24:27 crc kubenswrapper[4870]: E0130 08:24:27.278280 4870 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 08:24:27 crc kubenswrapper[4870]: E0130 08:24:27.278869 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert podName:46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:27.778721918 +0000 UTC m=+906.474269027 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert") pod "infra-operator-controller-manager-79955696d6-spzcf" (UID: "46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05") : secret "infra-operator-webhook-server-cert" not found Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.282924 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b429\" (UniqueName: \"kubernetes.io/projected/ea3efedd-cb74-48c7-b246-b188bac37ed4-kube-api-access-5b429\") pod \"mariadb-operator-controller-manager-67bf948998-59rt2\" (UID: \"ea3efedd-cb74-48c7-b246-b188bac37ed4\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.307205 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.313420 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8fb\" (UniqueName: \"kubernetes.io/projected/5680ceb3-f5ec-4d9e-a313-13564402bff2-kube-api-access-hw8fb\") pod \"ironic-operator-controller-manager-5f4b8bd54d-5vfrj\" (UID: \"5680ceb3-f5ec-4d9e-a313-13564402bff2\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.332989 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shhs\" (UniqueName: \"kubernetes.io/projected/db7aeba5-92f5-4887-9a6a-92d8c57650d2-kube-api-access-7shhs\") pod \"keystone-operator-controller-manager-84f48565d4-rhfst\" (UID: \"db7aeba5-92f5-4887-9a6a-92d8c57650d2\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.335525 4870 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-5b429\" (UniqueName: \"kubernetes.io/projected/ea3efedd-cb74-48c7-b246-b188bac37ed4-kube-api-access-5b429\") pod \"mariadb-operator-controller-manager-67bf948998-59rt2\" (UID: \"ea3efedd-cb74-48c7-b246-b188bac37ed4\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.414285 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lt4t\" (UniqueName: \"kubernetes.io/projected/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-kube-api-access-5lt4t\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.420387 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.421390 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.427673 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zlfh7" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.432622 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlbfg\" (UniqueName: \"kubernetes.io/projected/0ea209e2-96bf-4919-ad8f-f86de2b78ab1-kube-api-access-jlbfg\") pod \"neutron-operator-controller-manager-585dbc889-2xdfh\" (UID: \"0ea209e2-96bf-4919-ad8f-f86de2b78ab1\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.432704 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h68fs\" (UniqueName: \"kubernetes.io/projected/5cde6cc5-f427-4349-8c8a-3dce0deac5a9-kube-api-access-h68fs\") pod \"manila-operator-controller-manager-7dd968899f-j9bdn\" (UID: \"5cde6cc5-f427-4349-8c8a-3dce0deac5a9\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.432745 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v65mv\" (UniqueName: \"kubernetes.io/projected/604ff246-0f47-4c2c-8940-d76f10dce14e-kube-api-access-v65mv\") pod \"nova-operator-controller-manager-55bff696bd-cpn6f\" (UID: \"604ff246-0f47-4c2c-8940-d76f10dce14e\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.433052 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.479683 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.487609 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.494832 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h68fs\" (UniqueName: \"kubernetes.io/projected/5cde6cc5-f427-4349-8c8a-3dce0deac5a9-kube-api-access-h68fs\") pod \"manila-operator-controller-manager-7dd968899f-j9bdn\" (UID: \"5cde6cc5-f427-4349-8c8a-3dce0deac5a9\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.513513 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.530611 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.531752 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.533690 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v65mv\" (UniqueName: \"kubernetes.io/projected/604ff246-0f47-4c2c-8940-d76f10dce14e-kube-api-access-v65mv\") pod \"nova-operator-controller-manager-55bff696bd-cpn6f\" (UID: \"604ff246-0f47-4c2c-8940-d76f10dce14e\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.533764 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlnhn\" (UniqueName: \"kubernetes.io/projected/be7a26e3-9284-4316-bce7-7bc15c9178bd-kube-api-access-mlnhn\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.533807 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.533839 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmmmc\" (UniqueName: \"kubernetes.io/projected/2ee622d2-acd4-4eec-9fbb-12b5bae7e32f-kube-api-access-nmmmc\") pod \"octavia-operator-controller-manager-6687f8d877-4sftq\" (UID: \"2ee622d2-acd4-4eec-9fbb-12b5bae7e32f\") " 
pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.541188 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rwmjx" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.541455 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.543009 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.548070 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.549198 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.551717 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7jfpq" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.556264 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlbfg\" (UniqueName: \"kubernetes.io/projected/0ea209e2-96bf-4919-ad8f-f86de2b78ab1-kube-api-access-jlbfg\") pod \"neutron-operator-controller-manager-585dbc889-2xdfh\" (UID: \"0ea209e2-96bf-4919-ad8f-f86de2b78ab1\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.557667 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.565654 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.566031 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.566585 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.568251 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v65mv\" (UniqueName: \"kubernetes.io/projected/604ff246-0f47-4c2c-8940-d76f10dce14e-kube-api-access-v65mv\") pod \"nova-operator-controller-manager-55bff696bd-cpn6f\" (UID: \"604ff246-0f47-4c2c-8940-d76f10dce14e\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.572989 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vtxs2" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.582619 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-497sn"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.583704 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.588687 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-h2jx7" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.597438 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.602212 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.611103 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.621853 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-497sn"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.632760 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.634206 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.634926 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlnhn\" (UniqueName: \"kubernetes.io/projected/be7a26e3-9284-4316-bce7-7bc15c9178bd-kube-api-access-mlnhn\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.634997 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.635039 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmmmc\" (UniqueName: \"kubernetes.io/projected/2ee622d2-acd4-4eec-9fbb-12b5bae7e32f-kube-api-access-nmmmc\") pod \"octavia-operator-controller-manager-6687f8d877-4sftq\" (UID: \"2ee622d2-acd4-4eec-9fbb-12b5bae7e32f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" Jan 30 08:24:27 crc kubenswrapper[4870]: E0130 08:24:27.635243 4870 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 08:24:27 crc kubenswrapper[4870]: E0130 08:24:27.635314 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert podName:be7a26e3-9284-4316-bce7-7bc15c9178bd nodeName:}" failed. 
No retries permitted until 2026-01-30 08:24:28.135292121 +0000 UTC m=+906.830839230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" (UID: "be7a26e3-9284-4316-bce7-7bc15c9178bd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.638948 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7v7hd" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.645984 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd"] Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.674630 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmmmc\" (UniqueName: \"kubernetes.io/projected/2ee622d2-acd4-4eec-9fbb-12b5bae7e32f-kube-api-access-nmmmc\") pod \"octavia-operator-controller-manager-6687f8d877-4sftq\" (UID: \"2ee622d2-acd4-4eec-9fbb-12b5bae7e32f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.675126 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlnhn\" (UniqueName: \"kubernetes.io/projected/be7a26e3-9284-4316-bce7-7bc15c9178bd-kube-api-access-mlnhn\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.678060 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"] Jan 30 08:24:27 crc kubenswrapper[4870]: 
I0130 08:24:27.682810 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.686250 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mnwss"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.704512 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.717422 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.759968 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxxb\" (UniqueName: \"kubernetes.io/projected/2de7363a-3627-42bb-a58f-7bad2e414192-kube-api-access-5hxxb\") pod \"swift-operator-controller-manager-68fc8c869-497sn\" (UID: \"2de7363a-3627-42bb-a58f-7bad2e414192\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.760038 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgr5f\" (UniqueName: \"kubernetes.io/projected/ec9257db-1c02-4160-9c89-7df62f2ce602-kube-api-access-qgr5f\") pod \"ovn-operator-controller-manager-788c46999f-t4hbm\" (UID: \"ec9257db-1c02-4160-9c89-7df62f2ce602\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.760085 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q5dq\" (UniqueName: \"kubernetes.io/projected/274d3a56-3caf-4dd2-b122-e3b45a3eec6e-kube-api-access-2q5dq\") pod \"placement-operator-controller-manager-5b964cf4cd-mx5xp\" (UID: \"274d3a56-3caf-4dd2-b122-e3b45a3eec6e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.760195 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdljv\" (UniqueName: \"kubernetes.io/projected/0319ce7f-95ab-4abf-9101-bf436cc74bf4-kube-api-access-zdljv\") pod \"telemetry-operator-controller-manager-64b5b76f97-bmzrd\" (UID: \"0319ce7f-95ab-4abf-9101-bf436cc74bf4\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.760214 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldkm\" (UniqueName: \"kubernetes.io/projected/378c24d4-b8c1-4cd2-a85c-8449aa00ad3e-kube-api-access-tldkm\") pod \"test-operator-controller-manager-56f8bfcd9f-t8ncr\" (UID: \"378c24d4-b8c1-4cd2-a85c-8449aa00ad3e\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.763990 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.765157 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.768863 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2dhf6"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.788730 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.831003 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.875709 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdljv\" (UniqueName: \"kubernetes.io/projected/0319ce7f-95ab-4abf-9101-bf436cc74bf4-kube-api-access-zdljv\") pod \"telemetry-operator-controller-manager-64b5b76f97-bmzrd\" (UID: \"0319ce7f-95ab-4abf-9101-bf436cc74bf4\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.875759 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldkm\" (UniqueName: \"kubernetes.io/projected/378c24d4-b8c1-4cd2-a85c-8449aa00ad3e-kube-api-access-tldkm\") pod \"test-operator-controller-manager-56f8bfcd9f-t8ncr\" (UID: \"378c24d4-b8c1-4cd2-a85c-8449aa00ad3e\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.875831 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxxb\" (UniqueName: \"kubernetes.io/projected/2de7363a-3627-42bb-a58f-7bad2e414192-kube-api-access-5hxxb\") pod \"swift-operator-controller-manager-68fc8c869-497sn\" (UID: \"2de7363a-3627-42bb-a58f-7bad2e414192\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.875854 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgr5f\" (UniqueName: \"kubernetes.io/projected/ec9257db-1c02-4160-9c89-7df62f2ce602-kube-api-access-qgr5f\") pod \"ovn-operator-controller-manager-788c46999f-t4hbm\" (UID: \"ec9257db-1c02-4160-9c89-7df62f2ce602\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.875955 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q5dq\" (UniqueName: \"kubernetes.io/projected/274d3a56-3caf-4dd2-b122-e3b45a3eec6e-kube-api-access-2q5dq\") pod \"placement-operator-controller-manager-5b964cf4cd-mx5xp\" (UID: \"274d3a56-3caf-4dd2-b122-e3b45a3eec6e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.876008 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf"
Jan 30 08:24:27 crc kubenswrapper[4870]: E0130 08:24:27.876149 4870 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 08:24:27 crc kubenswrapper[4870]: E0130 08:24:27.876201 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert podName:46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:28.876186149 +0000 UTC m=+907.571733258 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert") pod "infra-operator-controller-manager-79955696d6-spzcf" (UID: "46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05") : secret "infra-operator-webhook-server-cert" not found
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.891624 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.892776 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.902407 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.902679 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.902841 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5fr4g"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.942679 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"]
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.950205 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgr5f\" (UniqueName: \"kubernetes.io/projected/ec9257db-1c02-4160-9c89-7df62f2ce602-kube-api-access-qgr5f\") pod \"ovn-operator-controller-manager-788c46999f-t4hbm\" (UID: \"ec9257db-1c02-4160-9c89-7df62f2ce602\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.951772 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldkm\" (UniqueName: \"kubernetes.io/projected/378c24d4-b8c1-4cd2-a85c-8449aa00ad3e-kube-api-access-tldkm\") pod \"test-operator-controller-manager-56f8bfcd9f-t8ncr\" (UID: \"378c24d4-b8c1-4cd2-a85c-8449aa00ad3e\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.961710 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q5dq\" (UniqueName: \"kubernetes.io/projected/274d3a56-3caf-4dd2-b122-e3b45a3eec6e-kube-api-access-2q5dq\") pod \"placement-operator-controller-manager-5b964cf4cd-mx5xp\" (UID: \"274d3a56-3caf-4dd2-b122-e3b45a3eec6e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.964387 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxxb\" (UniqueName: \"kubernetes.io/projected/2de7363a-3627-42bb-a58f-7bad2e414192-kube-api-access-5hxxb\") pod \"swift-operator-controller-manager-68fc8c869-497sn\" (UID: \"2de7363a-3627-42bb-a58f-7bad2e414192\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.964937 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdljv\" (UniqueName: \"kubernetes.io/projected/0319ce7f-95ab-4abf-9101-bf436cc74bf4-kube-api-access-zdljv\") pod \"telemetry-operator-controller-manager-64b5b76f97-bmzrd\" (UID: \"0319ce7f-95ab-4abf-9101-bf436cc74bf4\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.979059 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc59p\" (UniqueName: \"kubernetes.io/projected/d6956410-92c0-40bf-b1c1-a3353ccf1bbc-kube-api-access-mc59p\") pod \"watcher-operator-controller-manager-7b7dd57594-2p68v\" (UID: \"d6956410-92c0-40bf-b1c1-a3353ccf1bbc\") " pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"
Jan 30 08:24:27 crc kubenswrapper[4870]: I0130 08:24:27.995825 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"]
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.004511 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.007516 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-k5dk2"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.008307 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.021077 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"]
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.050561 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.082440 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.082497 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.082628 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmkf8\" (UniqueName: \"kubernetes.io/projected/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-kube-api-access-kmkf8\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.082746 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc59p\" (UniqueName: \"kubernetes.io/projected/d6956410-92c0-40bf-b1c1-a3353ccf1bbc-kube-api-access-mc59p\") pod \"watcher-operator-controller-manager-7b7dd57594-2p68v\" (UID: \"d6956410-92c0-40bf-b1c1-a3353ccf1bbc\") " pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.101911 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc59p\" (UniqueName: \"kubernetes.io/projected/d6956410-92c0-40bf-b1c1-a3353ccf1bbc-kube-api-access-mc59p\") pod \"watcher-operator-controller-manager-7b7dd57594-2p68v\" (UID: \"d6956410-92c0-40bf-b1c1-a3353ccf1bbc\") " pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.122349 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.195911 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.195973 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t5t8\" (UniqueName: \"kubernetes.io/projected/b706cc39-6af6-4a91-b2a2-6160148dadae-kube-api-access-8t5t8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sds6v\" (UID: \"b706cc39-6af6-4a91-b2a2-6160148dadae\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.196004 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.196034 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.196071 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmkf8\" (UniqueName: \"kubernetes.io/projected/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-kube-api-access-kmkf8\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.196425 4870 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.196484 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert podName:be7a26e3-9284-4316-bce7-7bc15c9178bd nodeName:}" failed. No retries permitted until 2026-01-30 08:24:29.19646664 +0000 UTC m=+907.892013749 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" (UID: "be7a26e3-9284-4316-bce7-7bc15c9178bd") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.196531 4870 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.196573 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:28.696557973 +0000 UTC m=+907.392105082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "webhook-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.196612 4870 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.196631 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:28.696625855 +0000 UTC m=+907.392172964 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "metrics-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.216480 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmkf8\" (UniqueName: \"kubernetes.io/projected/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-kube-api-access-kmkf8\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.262506 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.300168 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t5t8\" (UniqueName: \"kubernetes.io/projected/b706cc39-6af6-4a91-b2a2-6160148dadae-kube-api-access-8t5t8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sds6v\" (UID: \"b706cc39-6af6-4a91-b2a2-6160148dadae\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.322294 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t5t8\" (UniqueName: \"kubernetes.io/projected/b706cc39-6af6-4a91-b2a2-6160148dadae-kube-api-access-8t5t8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sds6v\" (UID: \"b706cc39-6af6-4a91-b2a2-6160148dadae\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.325851 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.368642 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9"]
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.379227 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq"]
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.412372 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.457853 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.478742 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.712608 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.712678 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.713160 4870 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.713276 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:29.713247236 +0000 UTC m=+908.408794555 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "webhook-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.713179 4870 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.713370 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:29.713344229 +0000 UTC m=+908.408891338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "metrics-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: I0130 08:24:28.919216 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf"
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.919578 4870 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 08:24:28 crc kubenswrapper[4870]: E0130 08:24:28.919777 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert podName:46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:30.919752054 +0000 UTC m=+909.615299163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert") pod "infra-operator-controller-manager-79955696d6-spzcf" (UID: "46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05") : secret "infra-operator-webhook-server-cert" not found
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.108097 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9" event={"ID":"54c01287-d66d-46bc-bbb8-7532263099c5","Type":"ContainerStarted","Data":"8705bbe514d4febdd908e9cb1e80f09f3b9bcc97d67c37a69fbdadce948fae15"}
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.112703 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq" event={"ID":"e973c5f3-3291-4d4b-85ce-806ef6f83c1a","Type":"ContainerStarted","Data":"8a04e10fbb8077fb4534634bfe13f1479349ee92c03526701e6408a74dfdd9ea"}
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.149970 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj"]
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.157239 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8"]
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.176804 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7"]
Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.189750 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod925313c0_6800_4a27_814b_887b46cf49ad.slice/crio-ca802d3170127a339a07e883131ed9b8367a04fddd9be1cc41a5c73076371779 WatchSource:0}: Error finding container ca802d3170127a339a07e883131ed9b8367a04fddd9be1cc41a5c73076371779: Status 404 returned error can't find the container with id ca802d3170127a339a07e883131ed9b8367a04fddd9be1cc41a5c73076371779
Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.193117 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96be73fb_f1fc_4c5c_a643_7b9dcc832ac6.slice/crio-82c40ea0caf56a142d720efdb33fcb2f16da6b40e6b95ca18f92f2e50b525477 WatchSource:0}: Error finding container 82c40ea0caf56a142d720efdb33fcb2f16da6b40e6b95ca18f92f2e50b525477: Status 404 returned error can't find the container with id 82c40ea0caf56a142d720efdb33fcb2f16da6b40e6b95ca18f92f2e50b525477
Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.198532 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5680ceb3_f5ec_4d9e_a313_13564402bff2.slice/crio-430ba50c6674f91e8d2de033cf5bdffa029096a44c6b3bbbed4448a2eef40fd6 WatchSource:0}: Error finding container 430ba50c6674f91e8d2de033cf5bdffa029096a44c6b3bbbed4448a2eef40fd6: Status 404 returned error can't find the container with id 430ba50c6674f91e8d2de033cf5bdffa029096a44c6b3bbbed4448a2eef40fd6
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.213614 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2"]
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.224719 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8"
Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.224973 4870 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.225028 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert podName:be7a26e3-9284-4316-bce7-7bc15c9178bd nodeName:}" failed. No retries permitted until 2026-01-30 08:24:31.225009834 +0000 UTC m=+909.920556943 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" (UID: "be7a26e3-9284-4316-bce7-7bc15c9178bd") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.225396 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd"]
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.234234 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq"]
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.242070 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg"]
Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.249464 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9449ead_e087_4895_a88a_8bdfe0835ebd.slice/crio-dbb168fdb23ded393de0c55f2ce0624be8483a0e7c9d08310feed200990f390d WatchSource:0}: Error finding container dbb168fdb23ded393de0c55f2ce0624be8483a0e7c9d08310feed200990f390d: Status 404 returned error can't find the container with id dbb168fdb23ded393de0c55f2ce0624be8483a0e7c9d08310feed200990f390d
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.254427 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh"]
Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.270384 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod604ff246_0f47_4c2c_8940_d76f10dce14e.slice/crio-efa1c9f6390852a8cb8c82ce7bb8f5364d8d5c898176f3124e9b3a5e46328118 WatchSource:0}: Error finding container efa1c9f6390852a8cb8c82ce7bb8f5364d8d5c898176f3124e9b3a5e46328118: Status 404 returned error can't find the container with id efa1c9f6390852a8cb8c82ce7bb8f5364d8d5c898176f3124e9b3a5e46328118
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.272959 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-497sn"]
Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.287198 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq"]
Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.292176 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2q5dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-mx5xp_openstack-operators(274d3a56-3caf-4dd2-b122-e3b45a3eec6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.293447 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zdljv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-bmzrd_openstack-operators(0319ce7f-95ab-4abf-9101-bf436cc74bf4): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.293672 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tldkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-t8ncr_openstack-operators(378c24d4-b8c1-4cd2-a85c-8449aa00ad3e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.293773 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" podUID="274d3a56-3caf-4dd2-b122-e3b45a3eec6e" Jan 30 08:24:29 crc 
kubenswrapper[4870]: E0130 08:24:29.294589 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" podUID="0319ce7f-95ab-4abf-9101-bf436cc74bf4" Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.294763 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" podUID="378c24d4-b8c1-4cd2-a85c-8449aa00ad3e" Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.296469 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9257db_1c02_4160_9c89_7df62f2ce602.slice/crio-b81a8e49e16a8195923ed1b679ffd27f372e0b1b5637d0796086f6d8f92cebbd WatchSource:0}: Error finding container b81a8e49e16a8195923ed1b679ffd27f372e0b1b5637d0796086f6d8f92cebbd: Status 404 returned error can't find the container with id b81a8e49e16a8195923ed1b679ffd27f372e0b1b5637d0796086f6d8f92cebbd Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.296585 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 
0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v65mv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-cpn6f_openstack-operators(604ff246-0f47-4c2c-8940-d76f10dce14e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.296695 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6956410_92c0_40bf_b1c1_a3353ccf1bbc.slice/crio-bcba828776f5fc5a4406acc7be034578a26f7703ed3d183bfda40ac288adebdd WatchSource:0}: Error finding container bcba828776f5fc5a4406acc7be034578a26f7703ed3d183bfda40ac288adebdd: Status 404 returned error can't find the container with id bcba828776f5fc5a4406acc7be034578a26f7703ed3d183bfda40ac288adebdd Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.297684 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" podUID="604ff246-0f47-4c2c-8940-d76f10dce14e" Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.299310 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgr5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-t4hbm_openstack-operators(ec9257db-1c02-4160-9c89-7df62f2ce602): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.300252 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn"] Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.300395 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" podUID="ec9257db-1c02-4160-9c89-7df62f2ce602" Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.300922 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:38.102.83.23:5001/openstack-k8s-operators/watcher-operator:3bb7e7b472cb523c5e84dcb1d8dd4ef08e04be5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mc59p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7b7dd57594-2p68v_openstack-operators(d6956410-92c0-40bf-b1c1-a3353ccf1bbc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.302138 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" podUID="d6956410-92c0-40bf-b1c1-a3353ccf1bbc" Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.305902 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f"] Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.310213 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr"] Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.314713 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm"] Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.320179 4870 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp"] Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.327488 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v"] Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.409090 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst"] Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.413535 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb7aeba5_92f5_4887_9a6a_92d8c57650d2.slice/crio-fc3a729f7ef0f0af0b616dd69825f76822f32f67675fdcbd46a004f703b9cd9e WatchSource:0}: Error finding container fc3a729f7ef0f0af0b616dd69825f76822f32f67675fdcbd46a004f703b9cd9e: Status 404 returned error can't find the container with id fc3a729f7ef0f0af0b616dd69825f76822f32f67675fdcbd46a004f703b9cd9e Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.413799 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v"] Jan 30 08:24:29 crc kubenswrapper[4870]: W0130 08:24:29.417630 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb706cc39_6af6_4a91_b2a2_6160148dadae.slice/crio-e899411b43da84d3c9e618f4eb95433909d455250c61a692462ce159867785e7 WatchSource:0}: Error finding container e899411b43da84d3c9e618f4eb95433909d455250c61a692462ce159867785e7: Status 404 returned error can't find the container with id e899411b43da84d3c9e618f4eb95433909d455250c61a692462ce159867785e7 Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.423019 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8t5t8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-sds6v_openstack-operators(b706cc39-6af6-4a91-b2a2-6160148dadae): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.424198 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" podUID="b706cc39-6af6-4a91-b2a2-6160148dadae" Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.735432 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:29 crc kubenswrapper[4870]: I0130 08:24:29.735482 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.735614 4870 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.735666 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:31.735651395 +0000 UTC m=+910.431198504 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "metrics-server-cert" not found Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.735713 4870 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 08:24:29 crc kubenswrapper[4870]: E0130 08:24:29.735732 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:31.735726558 +0000 UTC m=+910.431273667 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "webhook-server-cert" not found Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.127500 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" event={"ID":"2ee622d2-acd4-4eec-9fbb-12b5bae7e32f","Type":"ContainerStarted","Data":"5e966fb0d4d9db3f6a49c9436841c6e3dae3122a6b90541799ae68ad023c88ec"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.129415 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" event={"ID":"0319ce7f-95ab-4abf-9101-bf436cc74bf4","Type":"ContainerStarted","Data":"7566164dbbab2b8432786bcdc9c85380d0a409887d56688baf277e75721e2c55"} Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.131506 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" podUID="0319ce7f-95ab-4abf-9101-bf436cc74bf4" Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.134585 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" event={"ID":"0ea209e2-96bf-4919-ad8f-f86de2b78ab1","Type":"ContainerStarted","Data":"7ec4abcee0d4012ce955c11273afea1f91c53942b41877f8442abee9bd67ed82"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.146041 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" event={"ID":"ea3efedd-cb74-48c7-b246-b188bac37ed4","Type":"ContainerStarted","Data":"8b682904f00174b2d5eaa8f91f6b0b3572ae203a2564c6cbe17dab8378368141"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.163943 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg" event={"ID":"96be73fb-f1fc-4c5c-a643-7b9dcc832ac6","Type":"ContainerStarted","Data":"82c40ea0caf56a142d720efdb33fcb2f16da6b40e6b95ca18f92f2e50b525477"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.167205 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" event={"ID":"ec9257db-1c02-4160-9c89-7df62f2ce602","Type":"ContainerStarted","Data":"b81a8e49e16a8195923ed1b679ffd27f372e0b1b5637d0796086f6d8f92cebbd"} Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.169450 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" podUID="ec9257db-1c02-4160-9c89-7df62f2ce602" Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.170281 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn" event={"ID":"2de7363a-3627-42bb-a58f-7bad2e414192","Type":"ContainerStarted","Data":"13277e641351b1df4c23b359bf476808de89736339965a838a498c9846c83476"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.176956 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" event={"ID":"604ff246-0f47-4c2c-8940-d76f10dce14e","Type":"ContainerStarted","Data":"efa1c9f6390852a8cb8c82ce7bb8f5364d8d5c898176f3124e9b3a5e46328118"} Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.178463 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" podUID="604ff246-0f47-4c2c-8940-d76f10dce14e" Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.178999 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" event={"ID":"d6956410-92c0-40bf-b1c1-a3353ccf1bbc","Type":"ContainerStarted","Data":"bcba828776f5fc5a4406acc7be034578a26f7703ed3d183bfda40ac288adebdd"} Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.189413 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.23:5001/openstack-k8s-operators/watcher-operator:3bb7e7b472cb523c5e84dcb1d8dd4ef08e04be5e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" podUID="d6956410-92c0-40bf-b1c1-a3353ccf1bbc" Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.190907 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" event={"ID":"378c24d4-b8c1-4cd2-a85c-8449aa00ad3e","Type":"ContainerStarted","Data":"4bc7a92459fbfcf89605cd042fa8fcf00f27466aa87b5d958647ba965095155c"} Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.193443 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" podUID="378c24d4-b8c1-4cd2-a85c-8449aa00ad3e" Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.209859 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq" event={"ID":"b9449ead-e087-4895-a88a-8bdfe0835ebd","Type":"ContainerStarted","Data":"dbb168fdb23ded393de0c55f2ce0624be8483a0e7c9d08310feed200990f390d"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.218022 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" event={"ID":"db7aeba5-92f5-4887-9a6a-92d8c57650d2","Type":"ContainerStarted","Data":"fc3a729f7ef0f0af0b616dd69825f76822f32f67675fdcbd46a004f703b9cd9e"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.221522 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" 
event={"ID":"274d3a56-3caf-4dd2-b122-e3b45a3eec6e","Type":"ContainerStarted","Data":"c310807e5e1c21bf87a891d4a2bbccb9034a44097c57374f0aa07b0c7b91d9b4"} Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.230841 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" podUID="274d3a56-3caf-4dd2-b122-e3b45a3eec6e" Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.230846 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" event={"ID":"5cde6cc5-f427-4349-8c8a-3dce0deac5a9","Type":"ContainerStarted","Data":"63d0eaa47e604ad7f574bc6198cf9c5140d4f342d58b36801d5f5793073ef324"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.245830 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" event={"ID":"925313c0-6800-4a27-814b-887b46cf49ad","Type":"ContainerStarted","Data":"ca802d3170127a339a07e883131ed9b8367a04fddd9be1cc41a5c73076371779"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.250971 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" event={"ID":"5680ceb3-f5ec-4d9e-a313-13564402bff2","Type":"ContainerStarted","Data":"430ba50c6674f91e8d2de033cf5bdffa029096a44c6b3bbbed4448a2eef40fd6"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.259569 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8" 
event={"ID":"dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb","Type":"ContainerStarted","Data":"791f88e2d4e65862eed61d7c949be76a3cfad83eb03322249e0945038c994e83"} Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.261558 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" event={"ID":"b706cc39-6af6-4a91-b2a2-6160148dadae","Type":"ContainerStarted","Data":"e899411b43da84d3c9e618f4eb95433909d455250c61a692462ce159867785e7"} Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.264215 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" podUID="b706cc39-6af6-4a91-b2a2-6160148dadae" Jan 30 08:24:30 crc kubenswrapper[4870]: I0130 08:24:30.958622 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.958785 4870 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 08:24:30 crc kubenswrapper[4870]: E0130 08:24:30.958847 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert podName:46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:34.958825984 +0000 UTC m=+913.654373093 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert") pod "infra-operator-controller-manager-79955696d6-spzcf" (UID: "46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05") : secret "infra-operator-webhook-server-cert" not found Jan 30 08:24:31 crc kubenswrapper[4870]: I0130 08:24:31.263556 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.263758 4870 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.263861 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert podName:be7a26e3-9284-4316-bce7-7bc15c9178bd nodeName:}" failed. No retries permitted until 2026-01-30 08:24:35.263827686 +0000 UTC m=+913.959374795 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" (UID: "be7a26e3-9284-4316-bce7-7bc15c9178bd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.277455 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" podUID="0319ce7f-95ab-4abf-9101-bf436cc74bf4" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.277498 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" podUID="604ff246-0f47-4c2c-8940-d76f10dce14e" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.277732 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" podUID="274d3a56-3caf-4dd2-b122-e3b45a3eec6e" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.277811 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" podUID="ec9257db-1c02-4160-9c89-7df62f2ce602" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.277856 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" podUID="378c24d4-b8c1-4cd2-a85c-8449aa00ad3e" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.277916 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/openstack-k8s-operators/watcher-operator:3bb7e7b472cb523c5e84dcb1d8dd4ef08e04be5e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" podUID="d6956410-92c0-40bf-b1c1-a3353ccf1bbc" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.283598 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" podUID="b706cc39-6af6-4a91-b2a2-6160148dadae" Jan 30 08:24:31 crc kubenswrapper[4870]: I0130 08:24:31.781787 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: 
\"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:31 crc kubenswrapper[4870]: I0130 08:24:31.782064 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.782000 4870 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.782155 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:35.782137038 +0000 UTC m=+914.477684147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "webhook-server-cert" not found Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.782349 4870 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 08:24:31 crc kubenswrapper[4870]: E0130 08:24:31.782444 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:35.782415257 +0000 UTC m=+914.477962426 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "metrics-server-cert" not found Jan 30 08:24:34 crc kubenswrapper[4870]: I0130 08:24:34.961114 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:34 crc kubenswrapper[4870]: E0130 08:24:34.962465 4870 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 08:24:34 crc kubenswrapper[4870]: E0130 08:24:34.962559 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert podName:46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:42.962527344 +0000 UTC m=+921.658074453 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert") pod "infra-operator-controller-manager-79955696d6-spzcf" (UID: "46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05") : secret "infra-operator-webhook-server-cert" not found Jan 30 08:24:35 crc kubenswrapper[4870]: I0130 08:24:35.269443 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:35 crc kubenswrapper[4870]: E0130 08:24:35.269678 4870 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 08:24:35 crc kubenswrapper[4870]: E0130 08:24:35.269722 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert podName:be7a26e3-9284-4316-bce7-7bc15c9178bd nodeName:}" failed. No retries permitted until 2026-01-30 08:24:43.269709275 +0000 UTC m=+921.965256384 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" (UID: "be7a26e3-9284-4316-bce7-7bc15c9178bd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 08:24:35 crc kubenswrapper[4870]: I0130 08:24:35.879202 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:35 crc kubenswrapper[4870]: I0130 08:24:35.879284 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:35 crc kubenswrapper[4870]: E0130 08:24:35.879487 4870 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 08:24:35 crc kubenswrapper[4870]: E0130 08:24:35.879555 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:43.879533003 +0000 UTC m=+922.575080152 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "metrics-server-cert" not found Jan 30 08:24:35 crc kubenswrapper[4870]: E0130 08:24:35.880193 4870 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 08:24:35 crc kubenswrapper[4870]: E0130 08:24:35.880274 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:43.880255206 +0000 UTC m=+922.575802315 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "webhook-server-cert" not found Jan 30 08:24:40 crc kubenswrapper[4870]: E0130 08:24:40.904139 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8" Jan 30 08:24:40 crc kubenswrapper[4870]: E0130 08:24:40.906495 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tgrl7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5fb775575f-hbmf7_openstack-operators(925313c0-6800-4a27-814b-887b46cf49ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:24:40 crc kubenswrapper[4870]: E0130 08:24:40.909032 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" podUID="925313c0-6800-4a27-814b-887b46cf49ad" Jan 30 08:24:41 crc kubenswrapper[4870]: E0130 08:24:41.350426 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" podUID="925313c0-6800-4a27-814b-887b46cf49ad" Jan 30 08:24:41 crc kubenswrapper[4870]: E0130 08:24:41.610230 4870 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521" Jan 30 08:24:41 crc kubenswrapper[4870]: E0130 08:24:41.610423 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hw8fb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5f4b8bd54d-5vfrj_openstack-operators(5680ceb3-f5ec-4d9e-a313-13564402bff2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:24:41 crc kubenswrapper[4870]: E0130 08:24:41.611741 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" podUID="5680ceb3-f5ec-4d9e-a313-13564402bff2" Jan 30 08:24:42 crc kubenswrapper[4870]: E0130 08:24:42.359495 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" podUID="5680ceb3-f5ec-4d9e-a313-13564402bff2" Jan 30 08:24:42 crc kubenswrapper[4870]: I0130 08:24:42.997408 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.009062 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05-cert\") pod \"infra-operator-controller-manager-79955696d6-spzcf\" (UID: \"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.013075 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.311109 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.317221 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be7a26e3-9284-4316-bce7-7bc15c9178bd-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8\" (UID: \"be7a26e3-9284-4316-bce7-7bc15c9178bd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.502277 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg" event={"ID":"96be73fb-f1fc-4c5c-a643-7b9dcc832ac6","Type":"ContainerStarted","Data":"302f46534efc0fda5e49b4b3278427082e37cfa32b57b945fc313de4d8c47308"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.504328 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.527868 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq" event={"ID":"b9449ead-e087-4895-a88a-8bdfe0835ebd","Type":"ContainerStarted","Data":"443f92cafe17d3910cda94dced8968db9fb003f51da9a07a954f5a3c1e55275b"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.528441 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.556123 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.558344 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-spzcf"] Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.560211 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" event={"ID":"0ea209e2-96bf-4919-ad8f-f86de2b78ab1","Type":"ContainerStarted","Data":"0cefd9143f119d915c5d94a3d3fe169e9f64975a40722147c1788c569df20f20"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.562323 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.575646 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8" event={"ID":"dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb","Type":"ContainerStarted","Data":"21728291b78bccdf9e1cf02baa6b13e5ba2f0048f1783ece933c55b7e2b671e8"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.576915 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.594293 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg" podStartSLOduration=4.5183341630000005 podStartE2EDuration="17.594278009s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.198603984 +0000 UTC m=+907.894151093" lastFinishedPulling="2026-01-30 08:24:42.27454783 +0000 UTC m=+920.970094939" observedRunningTime="2026-01-30 08:24:43.591609925 +0000 UTC 
m=+922.287157034" watchObservedRunningTime="2026-01-30 08:24:43.594278009 +0000 UTC m=+922.289825118" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.606145 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" event={"ID":"ea3efedd-cb74-48c7-b246-b188bac37ed4","Type":"ContainerStarted","Data":"f74db5cd71c9fa468a550c1a71b8858ac93d619f5fc9aef5454f80dde5103d39"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.606941 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.655149 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" event={"ID":"5cde6cc5-f427-4349-8c8a-3dce0deac5a9","Type":"ContainerStarted","Data":"0a7ee0d2b5f000d0a431c934ca253da22aa50f69dd25e09135b5e3388d5508bf"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.656448 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.687376 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn" event={"ID":"2de7363a-3627-42bb-a58f-7bad2e414192","Type":"ContainerStarted","Data":"21bf925c73efb6176960423e3fdb3a3dffee536189cff78b10416c323cdc1a23"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.688327 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.708774 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" 
podStartSLOduration=4.647242642 podStartE2EDuration="17.708749975s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.213093389 +0000 UTC m=+907.908640498" lastFinishedPulling="2026-01-30 08:24:42.274600712 +0000 UTC m=+920.970147831" observedRunningTime="2026-01-30 08:24:43.707139625 +0000 UTC m=+922.402686734" watchObservedRunningTime="2026-01-30 08:24:43.708749975 +0000 UTC m=+922.404297084" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.711420 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq" podStartSLOduration=4.690552554 podStartE2EDuration="17.711412179s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.253781738 +0000 UTC m=+907.949328847" lastFinishedPulling="2026-01-30 08:24:42.274641323 +0000 UTC m=+920.970188472" observedRunningTime="2026-01-30 08:24:43.65667576 +0000 UTC m=+922.352222869" watchObservedRunningTime="2026-01-30 08:24:43.711412179 +0000 UTC m=+922.406959288" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.719189 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" event={"ID":"db7aeba5-92f5-4887-9a6a-92d8c57650d2","Type":"ContainerStarted","Data":"46c705cc2a19f3d3043d9480b4eb1564d5f732cff2cd7bbfa6f1a8fed8972571"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.719237 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.730077 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9" 
event={"ID":"54c01287-d66d-46bc-bbb8-7532263099c5","Type":"ContainerStarted","Data":"5eb0dd4b48f0c8ec718f0a8040f2d3be356550119225643713a48899a64c778b"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.730492 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.745745 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq" event={"ID":"e973c5f3-3291-4d4b-85ce-806ef6f83c1a","Type":"ContainerStarted","Data":"b9012e1f5cd8f7cf422d1d64797178731df193bb4b5da53ea6c7480a02f4188e"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.746559 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.747264 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8" podStartSLOduration=4.757318383 podStartE2EDuration="17.747253056s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.284307357 +0000 UTC m=+907.979854456" lastFinishedPulling="2026-01-30 08:24:42.27424202 +0000 UTC m=+920.969789129" observedRunningTime="2026-01-30 08:24:43.737266501 +0000 UTC m=+922.432813610" watchObservedRunningTime="2026-01-30 08:24:43.747253056 +0000 UTC m=+922.442800165" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.764115 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" event={"ID":"2ee622d2-acd4-4eec-9fbb-12b5bae7e32f","Type":"ContainerStarted","Data":"98939eb02384bd35f3d01780dc85c078ff3d2a6c6ff0cd99f8a632ace96c7325"} Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.765083 4870 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.779644 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" podStartSLOduration=4.814153528 podStartE2EDuration="17.779616192s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.415910921 +0000 UTC m=+908.111458030" lastFinishedPulling="2026-01-30 08:24:42.381373545 +0000 UTC m=+921.076920694" observedRunningTime="2026-01-30 08:24:43.778370972 +0000 UTC m=+922.473918081" watchObservedRunningTime="2026-01-30 08:24:43.779616192 +0000 UTC m=+922.475163301" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.812718 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn" podStartSLOduration=3.802834061 podStartE2EDuration="16.812693531s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.264836555 +0000 UTC m=+907.960383664" lastFinishedPulling="2026-01-30 08:24:42.274696025 +0000 UTC m=+920.970243134" observedRunningTime="2026-01-30 08:24:43.805563467 +0000 UTC m=+922.501110576" watchObservedRunningTime="2026-01-30 08:24:43.812693531 +0000 UTC m=+922.508240640" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.866831 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" podStartSLOduration=4.807320892 podStartE2EDuration="17.866815071s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.215005189 +0000 UTC m=+907.910552298" lastFinishedPulling="2026-01-30 08:24:42.274499368 +0000 UTC m=+920.970046477" observedRunningTime="2026-01-30 08:24:43.841209847 
+0000 UTC m=+922.536756956" watchObservedRunningTime="2026-01-30 08:24:43.866815071 +0000 UTC m=+922.562362180" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.868974 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" podStartSLOduration=4.798588758 podStartE2EDuration="17.868964019s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.265237007 +0000 UTC m=+907.960784116" lastFinishedPulling="2026-01-30 08:24:42.335612228 +0000 UTC m=+921.031159377" observedRunningTime="2026-01-30 08:24:43.864105486 +0000 UTC m=+922.559652595" watchObservedRunningTime="2026-01-30 08:24:43.868964019 +0000 UTC m=+922.564511128" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.897635 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9" podStartSLOduration=4.1018523 podStartE2EDuration="17.897619179s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:28.478514092 +0000 UTC m=+907.174061201" lastFinishedPulling="2026-01-30 08:24:42.274280971 +0000 UTC m=+920.969828080" observedRunningTime="2026-01-30 08:24:43.893507129 +0000 UTC m=+922.589054248" watchObservedRunningTime="2026-01-30 08:24:43.897619179 +0000 UTC m=+922.593166288" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.936173 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.936227 4870 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:43 crc kubenswrapper[4870]: E0130 08:24:43.942643 4870 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 08:24:43 crc kubenswrapper[4870]: E0130 08:24:43.942703 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs podName:fcdb20a3-7229-48e6-8f12-d1b6a5c892f3 nodeName:}" failed. No retries permitted until 2026-01-30 08:24:59.942686565 +0000 UTC m=+938.638233674 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs") pod "openstack-operator-controller-manager-65544cf747-sgxjd" (UID: "fcdb20a3-7229-48e6-8f12-d1b6a5c892f3") : secret "webhook-server-cert" not found Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.943644 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-metrics-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:24:43 crc kubenswrapper[4870]: I0130 08:24:43.955236 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq" podStartSLOduration=4.1598661 podStartE2EDuration="17.955196517s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:28.478963655 
+0000 UTC m=+907.174510764" lastFinishedPulling="2026-01-30 08:24:42.274294072 +0000 UTC m=+920.969841181" observedRunningTime="2026-01-30 08:24:43.922440749 +0000 UTC m=+922.617987868" watchObservedRunningTime="2026-01-30 08:24:43.955196517 +0000 UTC m=+922.650743626" Jan 30 08:24:44 crc kubenswrapper[4870]: I0130 08:24:44.139254 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" podStartSLOduration=4.115151483 podStartE2EDuration="17.13923777s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.250116662 +0000 UTC m=+907.945663771" lastFinishedPulling="2026-01-30 08:24:42.274202939 +0000 UTC m=+920.969750058" observedRunningTime="2026-01-30 08:24:43.962396224 +0000 UTC m=+922.657943333" watchObservedRunningTime="2026-01-30 08:24:44.13923777 +0000 UTC m=+922.834784879" Jan 30 08:24:44 crc kubenswrapper[4870]: I0130 08:24:44.144939 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8"] Jan 30 08:24:44 crc kubenswrapper[4870]: W0130 08:24:44.155861 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe7a26e3_9284_4316_bce7_7bc15c9178bd.slice/crio-fac987e50460bd25b1ae104eb4a6f63c98f74cefb841dc670a1788504ce60fba WatchSource:0}: Error finding container fac987e50460bd25b1ae104eb4a6f63c98f74cefb841dc670a1788504ce60fba: Status 404 returned error can't find the container with id fac987e50460bd25b1ae104eb4a6f63c98f74cefb841dc670a1788504ce60fba Jan 30 08:24:44 crc kubenswrapper[4870]: I0130 08:24:44.788340 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" 
event={"ID":"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05","Type":"ContainerStarted","Data":"027bbda16f8aafae2cbe7d992287e32c0f034c4bb79913d28b86fd7dde0f3cea"} Jan 30 08:24:44 crc kubenswrapper[4870]: I0130 08:24:44.801039 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" event={"ID":"be7a26e3-9284-4316-bce7-7bc15c9178bd","Type":"ContainerStarted","Data":"fac987e50460bd25b1ae104eb4a6f63c98f74cefb841dc670a1788504ce60fba"} Jan 30 08:24:47 crc kubenswrapper[4870]: I0130 08:24:47.111991 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-hsfpq" Jan 30 08:24:47 crc kubenswrapper[4870]: I0130 08:24:47.167719 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-grbz8" Jan 30 08:24:47 crc kubenswrapper[4870]: I0130 08:24:47.174251 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tkrpg" Jan 30 08:24:47 crc kubenswrapper[4870]: I0130 08:24:47.484469 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-rhfst" Jan 30 08:24:47 crc kubenswrapper[4870]: I0130 08:24:47.570320 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-59rt2" Jan 30 08:24:47 crc kubenswrapper[4870]: I0130 08:24:47.603924 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-j9bdn" Jan 30 08:24:47 crc kubenswrapper[4870]: I0130 08:24:47.616432 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2xdfh" Jan 30 08:24:48 crc kubenswrapper[4870]: I0130 08:24:48.131646 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-497sn" Jan 30 08:24:55 crc kubenswrapper[4870]: I0130 08:24:55.249430 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:24:55 crc kubenswrapper[4870]: I0130 08:24:55.250343 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:24:57 crc kubenswrapper[4870]: I0130 08:24:57.101974 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-wfpg9" Jan 30 08:24:57 crc kubenswrapper[4870]: I0130 08:24:57.249718 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-wlkxq" Jan 30 08:24:57 crc kubenswrapper[4870]: I0130 08:24:57.800559 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-4sftq" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.170444 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8t5h"] Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.171762 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.184343 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8t5h"] Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.313120 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-utilities\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.313352 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q99jp\" (UniqueName: \"kubernetes.io/projected/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-kube-api-access-q99jp\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.313624 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-catalog-content\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.415043 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q99jp\" (UniqueName: \"kubernetes.io/projected/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-kube-api-access-q99jp\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.415156 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-catalog-content\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.415220 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-utilities\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.415767 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-utilities\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.416396 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-catalog-content\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.439777 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q99jp\" (UniqueName: \"kubernetes.io/projected/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-kube-api-access-q99jp\") pod \"redhat-marketplace-l8t5h\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:24:59 crc kubenswrapper[4870]: I0130 08:24:59.497144 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:25:00 crc kubenswrapper[4870]: I0130 08:25:00.023190 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:25:00 crc kubenswrapper[4870]: I0130 08:25:00.029355 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fcdb20a3-7229-48e6-8f12-d1b6a5c892f3-webhook-certs\") pod \"openstack-operator-controller-manager-65544cf747-sgxjd\" (UID: \"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3\") " pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:25:00 crc kubenswrapper[4870]: I0130 08:25:00.245369 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:25:05 crc kubenswrapper[4870]: E0130 08:25:05.228907 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/openstack-k8s-operators/watcher-operator:3bb7e7b472cb523c5e84dcb1d8dd4ef08e04be5e" Jan 30 08:25:05 crc kubenswrapper[4870]: E0130 08:25:05.229974 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/openstack-k8s-operators/watcher-operator:3bb7e7b472cb523c5e84dcb1d8dd4ef08e04be5e" Jan 30 08:25:05 crc kubenswrapper[4870]: E0130 08:25:05.230656 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.23:5001/openstack-k8s-operators/watcher-operator:3bb7e7b472cb523c5e84dcb1d8dd4ef08e04be5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mc59p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7b7dd57594-2p68v_openstack-operators(d6956410-92c0-40bf-b1c1-a3353ccf1bbc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:25:05 crc kubenswrapper[4870]: E0130 08:25:05.231892 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" podUID="d6956410-92c0-40bf-b1c1-a3353ccf1bbc" Jan 30 08:25:06 crc kubenswrapper[4870]: E0130 08:25:06.026489 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 30 08:25:06 crc kubenswrapper[4870]: E0130 08:25:06.026822 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8t5t8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-sds6v_openstack-operators(b706cc39-6af6-4a91-b2a2-6160148dadae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:25:06 crc kubenswrapper[4870]: E0130 08:25:06.029094 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" podUID="b706cc39-6af6-4a91-b2a2-6160148dadae" Jan 30 08:25:06 crc kubenswrapper[4870]: I0130 08:25:06.592820 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8t5h"] Jan 30 08:25:06 crc kubenswrapper[4870]: W0130 08:25:06.593653 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ebdb93_c8ce_45c1_b10f_037853cc99d9.slice/crio-424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa WatchSource:0}: Error 
finding container 424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa: Status 404 returned error can't find the container with id 424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa Jan 30 08:25:06 crc kubenswrapper[4870]: I0130 08:25:06.654800 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd"] Jan 30 08:25:06 crc kubenswrapper[4870]: W0130 08:25:06.664832 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcdb20a3_7229_48e6_8f12_d1b6a5c892f3.slice/crio-b7388ad560dc0f84a65d0db103d1f861e42a83bf603f3f58c3147cd10324a7ed WatchSource:0}: Error finding container b7388ad560dc0f84a65d0db103d1f861e42a83bf603f3f58c3147cd10324a7ed: Status 404 returned error can't find the container with id b7388ad560dc0f84a65d0db103d1f861e42a83bf603f3f58c3147cd10324a7ed Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.082384 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" event={"ID":"ec9257db-1c02-4160-9c89-7df62f2ce602","Type":"ContainerStarted","Data":"598d8e0fbfa60e302ba1d679c6fd723141d602598c2deee6fd9f2aa0231d4429"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.083209 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.084399 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" event={"ID":"604ff246-0f47-4c2c-8940-d76f10dce14e","Type":"ContainerStarted","Data":"b8f6e228a17aac34e5cc8941d1a4b4ab4e8038bf3d42af6eda2e1be33010cbf8"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.084782 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.085568 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" event={"ID":"5680ceb3-f5ec-4d9e-a313-13564402bff2","Type":"ContainerStarted","Data":"ac2e2091012196765bb594492445d2d5c11ca198f17d2cd3f876f5046d2331b0"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.086041 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.088206 4870 generic.go:334] "Generic (PLEG): container finished" podID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerID="3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f" exitCode=0 Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.088367 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8t5h" event={"ID":"f2ebdb93-c8ce-45c1-b10f-037853cc99d9","Type":"ContainerDied","Data":"3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.088965 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8t5h" event={"ID":"f2ebdb93-c8ce-45c1-b10f-037853cc99d9","Type":"ContainerStarted","Data":"424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.094026 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" event={"ID":"274d3a56-3caf-4dd2-b122-e3b45a3eec6e","Type":"ContainerStarted","Data":"a20b2399deab0ca006f954d363d3c523d2c4eb3c8b08c0093d9f574dfb14ed99"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.094295 4870 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.098127 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" event={"ID":"be7a26e3-9284-4316-bce7-7bc15c9178bd","Type":"ContainerStarted","Data":"12ce709e6f07eb4baf6bc7c9d5f6268879da8c79f958c70510442e10096acd70"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.099029 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.104155 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" event={"ID":"925313c0-6800-4a27-814b-887b46cf49ad","Type":"ContainerStarted","Data":"3ec03c81bc3012aae78444827de3e3fb7bfa7f82bb77e8175a041a07e79f6eea"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.104397 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.105667 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" event={"ID":"378c24d4-b8c1-4cd2-a85c-8449aa00ad3e","Type":"ContainerStarted","Data":"9436a32bd8ad9b0e7f50cc8c9adeca28d70eba9075b08895e63f97d61c9c40ea"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.105808 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.107545 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" event={"ID":"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3","Type":"ContainerStarted","Data":"af1ea79b9c2d897c86152d93233c485038bf92e180307cc4bc3dd3ecf4428494"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.107576 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" event={"ID":"fcdb20a3-7229-48e6-8f12-d1b6a5c892f3","Type":"ContainerStarted","Data":"b7388ad560dc0f84a65d0db103d1f861e42a83bf603f3f58c3147cd10324a7ed"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.107697 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.108929 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" event={"ID":"0319ce7f-95ab-4abf-9101-bf436cc74bf4","Type":"ContainerStarted","Data":"462af9c4d0b02aafa48258f02270d794a1bbe41eb7b67e1441f92558cd4cc074"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.109284 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.111399 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" event={"ID":"46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05","Type":"ContainerStarted","Data":"01d7492f410f9e9fd7af41951b5d1897a037791a5dc63ee5379feb50748737a7"} Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.111662 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.124577 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" podStartSLOduration=7.176622634 podStartE2EDuration="40.124550699s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.299182313 +0000 UTC m=+907.994729422" lastFinishedPulling="2026-01-30 08:25:02.247110338 +0000 UTC m=+940.942657487" observedRunningTime="2026-01-30 08:25:07.114256576 +0000 UTC m=+945.809803685" watchObservedRunningTime="2026-01-30 08:25:07.124550699 +0000 UTC m=+945.820097808" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.146013 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" podStartSLOduration=9.1328392 podStartE2EDuration="40.145998443s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.293583937 +0000 UTC m=+907.989131046" lastFinishedPulling="2026-01-30 08:25:00.30674317 +0000 UTC m=+939.002290289" observedRunningTime="2026-01-30 08:25:07.139250141 +0000 UTC m=+945.834797250" watchObservedRunningTime="2026-01-30 08:25:07.145998443 +0000 UTC m=+945.841545552" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.174812 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" podStartSLOduration=4.359800194 podStartE2EDuration="41.174793427s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.213026557 +0000 UTC m=+907.908573666" lastFinishedPulling="2026-01-30 08:25:06.0280198 +0000 UTC m=+944.723566899" observedRunningTime="2026-01-30 08:25:07.173613381 +0000 UTC m=+945.869160490" watchObservedRunningTime="2026-01-30 08:25:07.174793427 +0000 UTC m=+945.870340536" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.217820 4870 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" podStartSLOduration=4.3635791919999996 podStartE2EDuration="41.217802298s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.191687817 +0000 UTC m=+907.887234926" lastFinishedPulling="2026-01-30 08:25:06.045910913 +0000 UTC m=+944.741458032" observedRunningTime="2026-01-30 08:25:07.21660062 +0000 UTC m=+945.912147729" watchObservedRunningTime="2026-01-30 08:25:07.217802298 +0000 UTC m=+945.913349407" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.296066 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" podStartSLOduration=19.258309283 podStartE2EDuration="40.296040796s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:44.166455915 +0000 UTC m=+922.862003024" lastFinishedPulling="2026-01-30 08:25:05.204187398 +0000 UTC m=+943.899734537" observedRunningTime="2026-01-30 08:25:07.295267102 +0000 UTC m=+945.990814211" watchObservedRunningTime="2026-01-30 08:25:07.296040796 +0000 UTC m=+945.991587905" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.391545 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" podStartSLOduration=40.391517166 podStartE2EDuration="40.391517166s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:25:07.378948241 +0000 UTC m=+946.074495350" watchObservedRunningTime="2026-01-30 08:25:07.391517166 +0000 UTC m=+946.087064275" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.476138 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" podStartSLOduration=3.770074524 podStartE2EDuration="40.476105894s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.293292228 +0000 UTC m=+907.988839337" lastFinishedPulling="2026-01-30 08:25:05.999323558 +0000 UTC m=+944.694870707" observedRunningTime="2026-01-30 08:25:07.472764349 +0000 UTC m=+946.168311458" watchObservedRunningTime="2026-01-30 08:25:07.476105894 +0000 UTC m=+946.171652993" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.511704 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" podStartSLOduration=3.79670409 podStartE2EDuration="40.511686751s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.292008068 +0000 UTC m=+907.987555167" lastFinishedPulling="2026-01-30 08:25:06.006990689 +0000 UTC m=+944.702537828" observedRunningTime="2026-01-30 08:25:07.509950227 +0000 UTC m=+946.205497336" watchObservedRunningTime="2026-01-30 08:25:07.511686751 +0000 UTC m=+946.207233860" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.542359 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" podStartSLOduration=5.6352795449999995 podStartE2EDuration="41.542343284s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.296514499 +0000 UTC m=+907.992061609" lastFinishedPulling="2026-01-30 08:25:05.203578229 +0000 UTC m=+943.899125348" observedRunningTime="2026-01-30 08:25:07.540564089 +0000 UTC m=+946.236111198" watchObservedRunningTime="2026-01-30 08:25:07.542343284 +0000 UTC m=+946.237890393" Jan 30 08:25:07 crc kubenswrapper[4870]: I0130 08:25:07.575055 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" podStartSLOduration=20.067128576 podStartE2EDuration="41.575031151s" podCreationTimestamp="2026-01-30 08:24:26 +0000 UTC" firstStartedPulling="2026-01-30 08:24:43.590995466 +0000 UTC m=+922.286542575" lastFinishedPulling="2026-01-30 08:25:05.098898021 +0000 UTC m=+943.794445150" observedRunningTime="2026-01-30 08:25:07.56927759 +0000 UTC m=+946.264824699" watchObservedRunningTime="2026-01-30 08:25:07.575031151 +0000 UTC m=+946.270578260" Jan 30 08:25:08 crc kubenswrapper[4870]: I0130 08:25:08.126397 4870 generic.go:334] "Generic (PLEG): container finished" podID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerID="ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9" exitCode=0 Jan 30 08:25:08 crc kubenswrapper[4870]: I0130 08:25:08.127353 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8t5h" event={"ID":"f2ebdb93-c8ce-45c1-b10f-037853cc99d9","Type":"ContainerDied","Data":"ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9"} Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.139043 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8t5h" event={"ID":"f2ebdb93-c8ce-45c1-b10f-037853cc99d9","Type":"ContainerStarted","Data":"1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8"} Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.497953 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.498011 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.821608 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8t5h" 
podStartSLOduration=9.287521785 podStartE2EDuration="10.821578029s" podCreationTimestamp="2026-01-30 08:24:59 +0000 UTC" firstStartedPulling="2026-01-30 08:25:07.089542149 +0000 UTC m=+945.785089258" lastFinishedPulling="2026-01-30 08:25:08.623598383 +0000 UTC m=+947.319145502" observedRunningTime="2026-01-30 08:25:09.168335976 +0000 UTC m=+947.863883085" watchObservedRunningTime="2026-01-30 08:25:09.821578029 +0000 UTC m=+948.517125178" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.828944 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4nh7"] Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.830766 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.845557 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4nh7"] Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.887329 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7z55\" (UniqueName: \"kubernetes.io/projected/d7766425-f469-4513-b62b-e44e3d3f81bc-kube-api-access-x7z55\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.887447 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-catalog-content\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.887491 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-utilities\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.988760 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-catalog-content\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.988856 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-utilities\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.988996 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7z55\" (UniqueName: \"kubernetes.io/projected/d7766425-f469-4513-b62b-e44e3d3f81bc-kube-api-access-x7z55\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.989445 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-catalog-content\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:09 crc kubenswrapper[4870]: I0130 08:25:09.989976 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-utilities\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:10 crc kubenswrapper[4870]: I0130 08:25:10.020820 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7z55\" (UniqueName: \"kubernetes.io/projected/d7766425-f469-4513-b62b-e44e3d3f81bc-kube-api-access-x7z55\") pod \"certified-operators-t4nh7\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:10 crc kubenswrapper[4870]: I0130 08:25:10.152176 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:10 crc kubenswrapper[4870]: I0130 08:25:10.474083 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4nh7"] Jan 30 08:25:10 crc kubenswrapper[4870]: W0130 08:25:10.477744 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7766425_f469_4513_b62b_e44e3d3f81bc.slice/crio-bf15e36614073a765c12f2a92a73edc04b67fdb7bd97803fa465e4ec3897310b WatchSource:0}: Error finding container bf15e36614073a765c12f2a92a73edc04b67fdb7bd97803fa465e4ec3897310b: Status 404 returned error can't find the container with id bf15e36614073a765c12f2a92a73edc04b67fdb7bd97803fa465e4ec3897310b Jan 30 08:25:10 crc kubenswrapper[4870]: I0130 08:25:10.548655 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-l8t5h" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="registry-server" probeResult="failure" output=< Jan 30 08:25:10 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:25:10 crc kubenswrapper[4870]: > Jan 30 08:25:11 crc kubenswrapper[4870]: I0130 
08:25:11.155540 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4nh7" event={"ID":"d7766425-f469-4513-b62b-e44e3d3f81bc","Type":"ContainerDied","Data":"ced22f80742d52f01c7a2a77934c1cc85ecaeb0560f0d90e44530ff08ba4d9a1"} Jan 30 08:25:11 crc kubenswrapper[4870]: I0130 08:25:11.155218 4870 generic.go:334] "Generic (PLEG): container finished" podID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerID="ced22f80742d52f01c7a2a77934c1cc85ecaeb0560f0d90e44530ff08ba4d9a1" exitCode=0 Jan 30 08:25:11 crc kubenswrapper[4870]: I0130 08:25:11.156255 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4nh7" event={"ID":"d7766425-f469-4513-b62b-e44e3d3f81bc","Type":"ContainerStarted","Data":"bf15e36614073a765c12f2a92a73edc04b67fdb7bd97803fa465e4ec3897310b"} Jan 30 08:25:12 crc kubenswrapper[4870]: I0130 08:25:12.173868 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4nh7" event={"ID":"d7766425-f469-4513-b62b-e44e3d3f81bc","Type":"ContainerStarted","Data":"59dd3f065b079b1913a4f498086fdcee013f59a43677085570cb3e003e1cb9ad"} Jan 30 08:25:13 crc kubenswrapper[4870]: I0130 08:25:13.021526 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-spzcf" Jan 30 08:25:13 crc kubenswrapper[4870]: I0130 08:25:13.184209 4870 generic.go:334] "Generic (PLEG): container finished" podID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerID="59dd3f065b079b1913a4f498086fdcee013f59a43677085570cb3e003e1cb9ad" exitCode=0 Jan 30 08:25:13 crc kubenswrapper[4870]: I0130 08:25:13.184272 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4nh7" event={"ID":"d7766425-f469-4513-b62b-e44e3d3f81bc","Type":"ContainerDied","Data":"59dd3f065b079b1913a4f498086fdcee013f59a43677085570cb3e003e1cb9ad"} Jan 30 08:25:13 crc 
kubenswrapper[4870]: I0130 08:25:13.571844 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8" Jan 30 08:25:14 crc kubenswrapper[4870]: I0130 08:25:14.208093 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4nh7" event={"ID":"d7766425-f469-4513-b62b-e44e3d3f81bc","Type":"ContainerStarted","Data":"717be5686005143024a4b1fb0b55c3b441fc39613b640266654be36845057b0c"} Jan 30 08:25:14 crc kubenswrapper[4870]: I0130 08:25:14.238367 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4nh7" podStartSLOduration=2.632265974 podStartE2EDuration="5.238349857s" podCreationTimestamp="2026-01-30 08:25:09 +0000 UTC" firstStartedPulling="2026-01-30 08:25:11.157444577 +0000 UTC m=+949.852991716" lastFinishedPulling="2026-01-30 08:25:13.76352849 +0000 UTC m=+952.459075599" observedRunningTime="2026-01-30 08:25:14.228458987 +0000 UTC m=+952.924006106" watchObservedRunningTime="2026-01-30 08:25:14.238349857 +0000 UTC m=+952.933896966" Jan 30 08:25:17 crc kubenswrapper[4870]: I0130 08:25:17.314537 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-hbmf7" Jan 30 08:25:17 crc kubenswrapper[4870]: I0130 08:25:17.440165 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-5vfrj" Jan 30 08:25:17 crc kubenswrapper[4870]: I0130 08:25:17.723066 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-cpn6f" Jan 30 08:25:18 crc kubenswrapper[4870]: I0130 08:25:18.012713 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-bmzrd" Jan 30 08:25:18 crc kubenswrapper[4870]: I0130 08:25:18.054667 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t4hbm" Jan 30 08:25:18 crc kubenswrapper[4870]: I0130 08:25:18.269486 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-mx5xp" Jan 30 08:25:18 crc kubenswrapper[4870]: I0130 08:25:18.328039 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-t8ncr" Jan 30 08:25:19 crc kubenswrapper[4870]: E0130 08:25:19.077226 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" podUID="b706cc39-6af6-4a91-b2a2-6160148dadae" Jan 30 08:25:19 crc kubenswrapper[4870]: E0130 08:25:19.077290 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/openstack-k8s-operators/watcher-operator:3bb7e7b472cb523c5e84dcb1d8dd4ef08e04be5e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" podUID="d6956410-92c0-40bf-b1c1-a3353ccf1bbc" Jan 30 08:25:19 crc kubenswrapper[4870]: I0130 08:25:19.581517 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:25:19 crc kubenswrapper[4870]: I0130 08:25:19.650663 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:25:19 crc kubenswrapper[4870]: I0130 08:25:19.820418 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8t5h"] Jan 30 08:25:20 crc kubenswrapper[4870]: I0130 08:25:20.152675 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:20 crc kubenswrapper[4870]: I0130 08:25:20.152775 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:20 crc kubenswrapper[4870]: I0130 08:25:20.209135 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:20 crc kubenswrapper[4870]: I0130 08:25:20.255811 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-65544cf747-sgxjd" Jan 30 08:25:20 crc kubenswrapper[4870]: I0130 08:25:20.335237 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.282496 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8t5h" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="registry-server" containerID="cri-o://1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8" gracePeriod=2 Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.800159 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.935281 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q99jp\" (UniqueName: \"kubernetes.io/projected/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-kube-api-access-q99jp\") pod \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.935372 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-catalog-content\") pod \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.935454 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-utilities\") pod \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\" (UID: \"f2ebdb93-c8ce-45c1-b10f-037853cc99d9\") " Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.936454 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-utilities" (OuterVolumeSpecName: "utilities") pod "f2ebdb93-c8ce-45c1-b10f-037853cc99d9" (UID: "f2ebdb93-c8ce-45c1-b10f-037853cc99d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.940765 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-kube-api-access-q99jp" (OuterVolumeSpecName: "kube-api-access-q99jp") pod "f2ebdb93-c8ce-45c1-b10f-037853cc99d9" (UID: "f2ebdb93-c8ce-45c1-b10f-037853cc99d9"). InnerVolumeSpecName "kube-api-access-q99jp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:25:21 crc kubenswrapper[4870]: I0130 08:25:21.966906 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2ebdb93-c8ce-45c1-b10f-037853cc99d9" (UID: "f2ebdb93-c8ce-45c1-b10f-037853cc99d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.038021 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.038069 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.038081 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q99jp\" (UniqueName: \"kubernetes.io/projected/f2ebdb93-c8ce-45c1-b10f-037853cc99d9-kube-api-access-q99jp\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.295362 4870 generic.go:334] "Generic (PLEG): container finished" podID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerID="1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8" exitCode=0 Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.295458 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8t5h" event={"ID":"f2ebdb93-c8ce-45c1-b10f-037853cc99d9","Type":"ContainerDied","Data":"1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8"} Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.295513 4870 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-l8t5h" event={"ID":"f2ebdb93-c8ce-45c1-b10f-037853cc99d9","Type":"ContainerDied","Data":"424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa"} Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.295512 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8t5h" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.295550 4870 scope.go:117] "RemoveContainer" containerID="1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.325842 4870 scope.go:117] "RemoveContainer" containerID="ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.330952 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8t5h"] Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.339645 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8t5h"] Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.349252 4870 scope.go:117] "RemoveContainer" containerID="3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.394750 4870 scope.go:117] "RemoveContainer" containerID="1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8" Jan 30 08:25:22 crc kubenswrapper[4870]: E0130 08:25:22.395224 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8\": container with ID starting with 1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8 not found: ID does not exist" containerID="1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.395264 4870 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8"} err="failed to get container status \"1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8\": rpc error: code = NotFound desc = could not find container \"1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8\": container with ID starting with 1b7258f0cb3b8d94cfbc81a8d15ba08d50e7cf8aabca07490fec1c5e1133bba8 not found: ID does not exist" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.395296 4870 scope.go:117] "RemoveContainer" containerID="ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9" Jan 30 08:25:22 crc kubenswrapper[4870]: E0130 08:25:22.395712 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9\": container with ID starting with ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9 not found: ID does not exist" containerID="ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.395742 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9"} err="failed to get container status \"ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9\": rpc error: code = NotFound desc = could not find container \"ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9\": container with ID starting with ae2c5a28d1054dd4f49367fe7abafb310d87c558c84f94dd0bfae47916a13ec9 not found: ID does not exist" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.395761 4870 scope.go:117] "RemoveContainer" containerID="3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f" Jan 30 08:25:22 crc kubenswrapper[4870]: E0130 
08:25:22.396118 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f\": container with ID starting with 3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f not found: ID does not exist" containerID="3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.396148 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f"} err="failed to get container status \"3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f\": rpc error: code = NotFound desc = could not find container \"3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f\": container with ID starting with 3c926d70117bcd5ef2815b7cdbe130647e38bb075a7c19633b0d0c4561316c4f not found: ID does not exist" Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.433204 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4nh7"] Jan 30 08:25:22 crc kubenswrapper[4870]: I0130 08:25:22.433606 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4nh7" podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="registry-server" containerID="cri-o://717be5686005143024a4b1fb0b55c3b441fc39613b640266654be36845057b0c" gracePeriod=2 Jan 30 08:25:24 crc kubenswrapper[4870]: I0130 08:25:24.096831 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" path="/var/lib/kubelet/pods/f2ebdb93-c8ce-45c1-b10f-037853cc99d9/volumes" Jan 30 08:25:24 crc kubenswrapper[4870]: I0130 08:25:24.313215 4870 generic.go:334] "Generic (PLEG): container finished" podID="d7766425-f469-4513-b62b-e44e3d3f81bc" 
containerID="717be5686005143024a4b1fb0b55c3b441fc39613b640266654be36845057b0c" exitCode=0 Jan 30 08:25:24 crc kubenswrapper[4870]: I0130 08:25:24.313278 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4nh7" event={"ID":"d7766425-f469-4513-b62b-e44e3d3f81bc","Type":"ContainerDied","Data":"717be5686005143024a4b1fb0b55c3b441fc39613b640266654be36845057b0c"} Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.250129 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.251048 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.514104 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.609507 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-catalog-content\") pod \"d7766425-f469-4513-b62b-e44e3d3f81bc\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.609652 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-utilities\") pod \"d7766425-f469-4513-b62b-e44e3d3f81bc\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.609722 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7z55\" (UniqueName: \"kubernetes.io/projected/d7766425-f469-4513-b62b-e44e3d3f81bc-kube-api-access-x7z55\") pod \"d7766425-f469-4513-b62b-e44e3d3f81bc\" (UID: \"d7766425-f469-4513-b62b-e44e3d3f81bc\") " Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.612047 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-utilities" (OuterVolumeSpecName: "utilities") pod "d7766425-f469-4513-b62b-e44e3d3f81bc" (UID: "d7766425-f469-4513-b62b-e44e3d3f81bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.614892 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7766425-f469-4513-b62b-e44e3d3f81bc-kube-api-access-x7z55" (OuterVolumeSpecName: "kube-api-access-x7z55") pod "d7766425-f469-4513-b62b-e44e3d3f81bc" (UID: "d7766425-f469-4513-b62b-e44e3d3f81bc"). InnerVolumeSpecName "kube-api-access-x7z55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.678618 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7766425-f469-4513-b62b-e44e3d3f81bc" (UID: "d7766425-f469-4513-b62b-e44e3d3f81bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.711755 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.711792 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7z55\" (UniqueName: \"kubernetes.io/projected/d7766425-f469-4513-b62b-e44e3d3f81bc-kube-api-access-x7z55\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:25 crc kubenswrapper[4870]: I0130 08:25:25.711807 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7766425-f469-4513-b62b-e44e3d3f81bc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:26 crc kubenswrapper[4870]: E0130 08:25:26.175420 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ebdb93_c8ce_45c1_b10f_037853cc99d9.slice/crio-424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa\": RecentStats: unable to find data in memory cache]" Jan 30 08:25:26 crc kubenswrapper[4870]: I0130 08:25:26.337193 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4nh7" 
event={"ID":"d7766425-f469-4513-b62b-e44e3d3f81bc","Type":"ContainerDied","Data":"bf15e36614073a765c12f2a92a73edc04b67fdb7bd97803fa465e4ec3897310b"} Jan 30 08:25:26 crc kubenswrapper[4870]: I0130 08:25:26.337843 4870 scope.go:117] "RemoveContainer" containerID="717be5686005143024a4b1fb0b55c3b441fc39613b640266654be36845057b0c" Jan 30 08:25:26 crc kubenswrapper[4870]: I0130 08:25:26.337340 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4nh7" Jan 30 08:25:26 crc kubenswrapper[4870]: I0130 08:25:26.404297 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4nh7"] Jan 30 08:25:26 crc kubenswrapper[4870]: I0130 08:25:26.409756 4870 scope.go:117] "RemoveContainer" containerID="59dd3f065b079b1913a4f498086fdcee013f59a43677085570cb3e003e1cb9ad" Jan 30 08:25:26 crc kubenswrapper[4870]: I0130 08:25:26.414114 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4nh7"] Jan 30 08:25:26 crc kubenswrapper[4870]: I0130 08:25:26.434429 4870 scope.go:117] "RemoveContainer" containerID="ced22f80742d52f01c7a2a77934c1cc85ecaeb0560f0d90e44530ff08ba4d9a1" Jan 30 08:25:28 crc kubenswrapper[4870]: I0130 08:25:28.092557 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" path="/var/lib/kubelet/pods/d7766425-f469-4513-b62b-e44e3d3f81bc/volumes" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.667356 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w8txq"] Jan 30 08:25:29 crc kubenswrapper[4870]: E0130 08:25:29.668025 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="registry-server" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668038 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="registry-server" Jan 30 08:25:29 crc kubenswrapper[4870]: E0130 08:25:29.668050 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="extract-utilities" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668056 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="extract-utilities" Jan 30 08:25:29 crc kubenswrapper[4870]: E0130 08:25:29.668069 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="extract-utilities" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668077 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="extract-utilities" Jan 30 08:25:29 crc kubenswrapper[4870]: E0130 08:25:29.668088 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="registry-server" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668094 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="registry-server" Jan 30 08:25:29 crc kubenswrapper[4870]: E0130 08:25:29.668110 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="extract-content" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668116 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="extract-content" Jan 30 08:25:29 crc kubenswrapper[4870]: E0130 08:25:29.668126 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="extract-content" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668132 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="extract-content" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668259 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ebdb93-c8ce-45c1-b10f-037853cc99d9" containerName="registry-server" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.668268 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7766425-f469-4513-b62b-e44e3d3f81bc" containerName="registry-server" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.669314 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.687290 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8txq"] Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.793274 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-catalog-content\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.793328 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fv4\" (UniqueName: \"kubernetes.io/projected/815f19a2-b916-4782-b47c-84c3e9f7256b-kube-api-access-q8fv4\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.793375 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-utilities\") pod \"community-operators-w8txq\" (UID: 
\"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.895078 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-utilities\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.895235 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-catalog-content\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.895265 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fv4\" (UniqueName: \"kubernetes.io/projected/815f19a2-b916-4782-b47c-84c3e9f7256b-kube-api-access-q8fv4\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.895611 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-utilities\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.895971 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-catalog-content\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") 
" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:29 crc kubenswrapper[4870]: I0130 08:25:29.919395 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fv4\" (UniqueName: \"kubernetes.io/projected/815f19a2-b916-4782-b47c-84c3e9f7256b-kube-api-access-q8fv4\") pod \"community-operators-w8txq\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:30 crc kubenswrapper[4870]: I0130 08:25:30.001397 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:30 crc kubenswrapper[4870]: I0130 08:25:30.515447 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8txq"] Jan 30 08:25:30 crc kubenswrapper[4870]: W0130 08:25:30.524700 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815f19a2_b916_4782_b47c_84c3e9f7256b.slice/crio-dffa78b9cc8ecf235e2db29e9a60513b5a7e5fad878eed0fcc2cadbd97531a8a WatchSource:0}: Error finding container dffa78b9cc8ecf235e2db29e9a60513b5a7e5fad878eed0fcc2cadbd97531a8a: Status 404 returned error can't find the container with id dffa78b9cc8ecf235e2db29e9a60513b5a7e5fad878eed0fcc2cadbd97531a8a Jan 30 08:25:31 crc kubenswrapper[4870]: I0130 08:25:31.390962 4870 generic.go:334] "Generic (PLEG): container finished" podID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerID="f0ec955f8bd758261b3b0837ffe61d337bbaee4c2b09b72c8be041200568c84d" exitCode=0 Jan 30 08:25:31 crc kubenswrapper[4870]: I0130 08:25:31.391042 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8txq" event={"ID":"815f19a2-b916-4782-b47c-84c3e9f7256b","Type":"ContainerDied","Data":"f0ec955f8bd758261b3b0837ffe61d337bbaee4c2b09b72c8be041200568c84d"} Jan 30 08:25:31 crc kubenswrapper[4870]: I0130 
08:25:31.392209 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8txq" event={"ID":"815f19a2-b916-4782-b47c-84c3e9f7256b","Type":"ContainerStarted","Data":"dffa78b9cc8ecf235e2db29e9a60513b5a7e5fad878eed0fcc2cadbd97531a8a"} Jan 30 08:25:32 crc kubenswrapper[4870]: I0130 08:25:32.401961 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" event={"ID":"b706cc39-6af6-4a91-b2a2-6160148dadae","Type":"ContainerStarted","Data":"41ae688cfff4407f497ec5966bb3d9b09c4f68a691b626233e760ff995c8fab9"} Jan 30 08:25:32 crc kubenswrapper[4870]: I0130 08:25:32.404588 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" event={"ID":"d6956410-92c0-40bf-b1c1-a3353ccf1bbc","Type":"ContainerStarted","Data":"60cadaf796b491d2cdd2165fdf687979e8eeb93a44e6764c727b1055c0524514"} Jan 30 08:25:32 crc kubenswrapper[4870]: I0130 08:25:32.404926 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" Jan 30 08:25:32 crc kubenswrapper[4870]: I0130 08:25:32.409732 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8txq" event={"ID":"815f19a2-b916-4782-b47c-84c3e9f7256b","Type":"ContainerStarted","Data":"f34b58d15d78e4ed6c3d87597e6195d81d3ae51c781f14d73eeda3ad6761a6b7"} Jan 30 08:25:32 crc kubenswrapper[4870]: I0130 08:25:32.427756 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sds6v" podStartSLOduration=3.284596121 podStartE2EDuration="1m5.427728577s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.42288918 +0000 UTC m=+908.118436289" lastFinishedPulling="2026-01-30 08:25:31.566021626 +0000 UTC m=+970.261568745" 
observedRunningTime="2026-01-30 08:25:32.424940439 +0000 UTC m=+971.120487568" watchObservedRunningTime="2026-01-30 08:25:32.427728577 +0000 UTC m=+971.123275686" Jan 30 08:25:32 crc kubenswrapper[4870]: I0130 08:25:32.484080 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" podStartSLOduration=2.616284786 podStartE2EDuration="1m5.484053337s" podCreationTimestamp="2026-01-30 08:24:27 +0000 UTC" firstStartedPulling="2026-01-30 08:24:29.300832996 +0000 UTC m=+907.996380105" lastFinishedPulling="2026-01-30 08:25:32.168601537 +0000 UTC m=+970.864148656" observedRunningTime="2026-01-30 08:25:32.474340601 +0000 UTC m=+971.169887740" watchObservedRunningTime="2026-01-30 08:25:32.484053337 +0000 UTC m=+971.179600486" Jan 30 08:25:33 crc kubenswrapper[4870]: I0130 08:25:33.423195 4870 generic.go:334] "Generic (PLEG): container finished" podID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerID="f34b58d15d78e4ed6c3d87597e6195d81d3ae51c781f14d73eeda3ad6761a6b7" exitCode=0 Jan 30 08:25:33 crc kubenswrapper[4870]: I0130 08:25:33.423285 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8txq" event={"ID":"815f19a2-b916-4782-b47c-84c3e9f7256b","Type":"ContainerDied","Data":"f34b58d15d78e4ed6c3d87597e6195d81d3ae51c781f14d73eeda3ad6761a6b7"} Jan 30 08:25:34 crc kubenswrapper[4870]: I0130 08:25:34.434107 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8txq" event={"ID":"815f19a2-b916-4782-b47c-84c3e9f7256b","Type":"ContainerStarted","Data":"fb6633fc11d8cb9a25a5e95a83f794bb99d01e445ef450402b6e145aadfc35b8"} Jan 30 08:25:34 crc kubenswrapper[4870]: I0130 08:25:34.465516 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w8txq" podStartSLOduration=3.044836489 podStartE2EDuration="5.465491656s" 
podCreationTimestamp="2026-01-30 08:25:29 +0000 UTC" firstStartedPulling="2026-01-30 08:25:31.393159636 +0000 UTC m=+970.088706775" lastFinishedPulling="2026-01-30 08:25:33.813814803 +0000 UTC m=+972.509361942" observedRunningTime="2026-01-30 08:25:34.456014439 +0000 UTC m=+973.151561568" watchObservedRunningTime="2026-01-30 08:25:34.465491656 +0000 UTC m=+973.161038785" Jan 30 08:25:36 crc kubenswrapper[4870]: E0130 08:25:36.387101 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ebdb93_c8ce_45c1_b10f_037853cc99d9.slice/crio-424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa\": RecentStats: unable to find data in memory cache]" Jan 30 08:25:38 crc kubenswrapper[4870]: I0130 08:25:38.417187 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7b7dd57594-2p68v" Jan 30 08:25:40 crc kubenswrapper[4870]: I0130 08:25:40.002444 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:40 crc kubenswrapper[4870]: I0130 08:25:40.002505 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:40 crc kubenswrapper[4870]: I0130 08:25:40.059229 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:40 crc kubenswrapper[4870]: I0130 08:25:40.652848 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:41 crc kubenswrapper[4870]: I0130 08:25:41.028393 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8txq"] Jan 30 08:25:42 crc kubenswrapper[4870]: I0130 08:25:42.598577 4870 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w8txq" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="registry-server" containerID="cri-o://fb6633fc11d8cb9a25a5e95a83f794bb99d01e445ef450402b6e145aadfc35b8" gracePeriod=2 Jan 30 08:25:43 crc kubenswrapper[4870]: I0130 08:25:43.623436 4870 generic.go:334] "Generic (PLEG): container finished" podID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerID="fb6633fc11d8cb9a25a5e95a83f794bb99d01e445ef450402b6e145aadfc35b8" exitCode=0 Jan 30 08:25:43 crc kubenswrapper[4870]: I0130 08:25:43.623774 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8txq" event={"ID":"815f19a2-b916-4782-b47c-84c3e9f7256b","Type":"ContainerDied","Data":"fb6633fc11d8cb9a25a5e95a83f794bb99d01e445ef450402b6e145aadfc35b8"} Jan 30 08:25:43 crc kubenswrapper[4870]: I0130 08:25:43.907915 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:43 crc kubenswrapper[4870]: I0130 08:25:43.981181 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-utilities\") pod \"815f19a2-b916-4782-b47c-84c3e9f7256b\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " Jan 30 08:25:43 crc kubenswrapper[4870]: I0130 08:25:43.981257 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fv4\" (UniqueName: \"kubernetes.io/projected/815f19a2-b916-4782-b47c-84c3e9f7256b-kube-api-access-q8fv4\") pod \"815f19a2-b916-4782-b47c-84c3e9f7256b\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " Jan 30 08:25:43 crc kubenswrapper[4870]: I0130 08:25:43.981282 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-catalog-content\") pod \"815f19a2-b916-4782-b47c-84c3e9f7256b\" (UID: \"815f19a2-b916-4782-b47c-84c3e9f7256b\") " Jan 30 08:25:43 crc kubenswrapper[4870]: I0130 08:25:43.982178 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-utilities" (OuterVolumeSpecName: "utilities") pod "815f19a2-b916-4782-b47c-84c3e9f7256b" (UID: "815f19a2-b916-4782-b47c-84c3e9f7256b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.011285 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815f19a2-b916-4782-b47c-84c3e9f7256b-kube-api-access-q8fv4" (OuterVolumeSpecName: "kube-api-access-q8fv4") pod "815f19a2-b916-4782-b47c-84c3e9f7256b" (UID: "815f19a2-b916-4782-b47c-84c3e9f7256b"). InnerVolumeSpecName "kube-api-access-q8fv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.082382 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.082409 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8fv4\" (UniqueName: \"kubernetes.io/projected/815f19a2-b916-4782-b47c-84c3e9f7256b-kube-api-access-q8fv4\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.365682 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "815f19a2-b916-4782-b47c-84c3e9f7256b" (UID: "815f19a2-b916-4782-b47c-84c3e9f7256b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.388325 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815f19a2-b916-4782-b47c-84c3e9f7256b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.640091 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8txq" event={"ID":"815f19a2-b916-4782-b47c-84c3e9f7256b","Type":"ContainerDied","Data":"dffa78b9cc8ecf235e2db29e9a60513b5a7e5fad878eed0fcc2cadbd97531a8a"} Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.640155 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8txq" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.640206 4870 scope.go:117] "RemoveContainer" containerID="fb6633fc11d8cb9a25a5e95a83f794bb99d01e445ef450402b6e145aadfc35b8" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.673573 4870 scope.go:117] "RemoveContainer" containerID="f34b58d15d78e4ed6c3d87597e6195d81d3ae51c781f14d73eeda3ad6761a6b7" Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.705751 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8txq"] Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.714072 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w8txq"] Jan 30 08:25:44 crc kubenswrapper[4870]: I0130 08:25:44.729071 4870 scope.go:117] "RemoveContainer" containerID="f0ec955f8bd758261b3b0837ffe61d337bbaee4c2b09b72c8be041200568c84d" Jan 30 08:25:46 crc kubenswrapper[4870]: I0130 08:25:46.088612 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" path="/var/lib/kubelet/pods/815f19a2-b916-4782-b47c-84c3e9f7256b/volumes" Jan 30 08:25:46 crc 
kubenswrapper[4870]: E0130 08:25:46.649534 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ebdb93_c8ce_45c1_b10f_037853cc99d9.slice/crio-424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa\": RecentStats: unable to find data in memory cache]" Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.250249 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.250901 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.250972 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.251968 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"902c9bb8b96922377d8d6da6fb79b392e9bc4a710daf7c3c1a77d5b9c2b536ac"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.252067 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" 
podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://902c9bb8b96922377d8d6da6fb79b392e9bc4a710daf7c3c1a77d5b9c2b536ac" gracePeriod=600 Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.729655 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="902c9bb8b96922377d8d6da6fb79b392e9bc4a710daf7c3c1a77d5b9c2b536ac" exitCode=0 Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.729729 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"902c9bb8b96922377d8d6da6fb79b392e9bc4a710daf7c3c1a77d5b9c2b536ac"} Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.730005 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"736e1ea4b0b2b4fa81dc9ec4fa9950e05f221b62734f7e8de7d7969e9158f7ae"} Jan 30 08:25:55 crc kubenswrapper[4870]: I0130 08:25:55.730027 4870 scope.go:117] "RemoveContainer" containerID="8f05305445b605660ea999aab22b621a1da0c30929b1af3251f46624decd30be" Jan 30 08:25:56 crc kubenswrapper[4870]: E0130 08:25:56.841411 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ebdb93_c8ce_45c1_b10f_037853cc99d9.slice/crio-424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa\": RecentStats: unable to find data in memory cache]" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.192477 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9cd4f5bf5-n8lzz"] Jan 30 08:26:00 crc kubenswrapper[4870]: E0130 08:26:00.193453 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="extract-content" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.193470 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="extract-content" Jan 30 08:26:00 crc kubenswrapper[4870]: E0130 08:26:00.193484 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="registry-server" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.193492 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="registry-server" Jan 30 08:26:00 crc kubenswrapper[4870]: E0130 08:26:00.193507 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="extract-utilities" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.193516 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="extract-utilities" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.193699 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="815f19a2-b916-4782-b47c-84c3e9f7256b" containerName="registry-server" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.194699 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.197769 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.197857 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.197940 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.198555 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-52wst" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.200451 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9cd4f5bf5-n8lzz"] Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.241487 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-config\") pod \"dnsmasq-dns-9cd4f5bf5-n8lzz\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.241542 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwpxs\" (UniqueName: \"kubernetes.io/projected/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-kube-api-access-bwpxs\") pod \"dnsmasq-dns-9cd4f5bf5-n8lzz\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.252176 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dd95798b9-btgvj"] Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.254131 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.256245 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.265245 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd95798b9-btgvj"] Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.343056 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjl4f\" (UniqueName: \"kubernetes.io/projected/c135f9a2-386b-4108-a40d-a703e4d72b13-kube-api-access-hjl4f\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.343149 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-config\") pod \"dnsmasq-dns-9cd4f5bf5-n8lzz\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.343185 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwpxs\" (UniqueName: \"kubernetes.io/projected/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-kube-api-access-bwpxs\") pod \"dnsmasq-dns-9cd4f5bf5-n8lzz\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.343209 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-config\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc 
kubenswrapper[4870]: I0130 08:26:00.343230 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-dns-svc\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.344048 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-config\") pod \"dnsmasq-dns-9cd4f5bf5-n8lzz\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.360568 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwpxs\" (UniqueName: \"kubernetes.io/projected/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-kube-api-access-bwpxs\") pod \"dnsmasq-dns-9cd4f5bf5-n8lzz\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.444466 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-config\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.444688 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-dns-svc\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.444867 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hjl4f\" (UniqueName: \"kubernetes.io/projected/c135f9a2-386b-4108-a40d-a703e4d72b13-kube-api-access-hjl4f\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.445411 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-dns-svc\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.445415 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-config\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.466592 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjl4f\" (UniqueName: \"kubernetes.io/projected/c135f9a2-386b-4108-a40d-a703e4d72b13-kube-api-access-hjl4f\") pod \"dnsmasq-dns-6dd95798b9-btgvj\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.516596 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:00 crc kubenswrapper[4870]: I0130 08:26:00.572459 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:01 crc kubenswrapper[4870]: I0130 08:26:01.044033 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd95798b9-btgvj"] Jan 30 08:26:01 crc kubenswrapper[4870]: W0130 08:26:01.130730 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c2e9ac8_fed6_4f2e_9a1b_26e0b253a3d2.slice/crio-954aad0a8aef2531b8fe421c82e8e2a545985a769437ed6265789109329f05be WatchSource:0}: Error finding container 954aad0a8aef2531b8fe421c82e8e2a545985a769437ed6265789109329f05be: Status 404 returned error can't find the container with id 954aad0a8aef2531b8fe421c82e8e2a545985a769437ed6265789109329f05be Jan 30 08:26:01 crc kubenswrapper[4870]: I0130 08:26:01.132181 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9cd4f5bf5-n8lzz"] Jan 30 08:26:01 crc kubenswrapper[4870]: I0130 08:26:01.782339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" event={"ID":"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2","Type":"ContainerStarted","Data":"954aad0a8aef2531b8fe421c82e8e2a545985a769437ed6265789109329f05be"} Jan 30 08:26:01 crc kubenswrapper[4870]: I0130 08:26:01.783909 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" event={"ID":"c135f9a2-386b-4108-a40d-a703e4d72b13","Type":"ContainerStarted","Data":"85ed67ebbca181d72c31532b0492443d898520cd3d920466a779ed625e90274a"} Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.010779 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd95798b9-btgvj"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.030699 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64594fd94f-bp9gj"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.032079 4870 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.048535 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64594fd94f-bp9gj"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.224739 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wbn\" (UniqueName: \"kubernetes.io/projected/f491adde-145d-44fc-9414-0fd92c41a114-kube-api-access-26wbn\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.224954 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-dns-svc\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.225031 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-config\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.328226 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-dns-svc\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.328477 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-config\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.328507 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wbn\" (UniqueName: \"kubernetes.io/projected/f491adde-145d-44fc-9414-0fd92c41a114-kube-api-access-26wbn\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.329613 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-dns-svc\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.331161 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-config\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.346828 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9cd4f5bf5-n8lzz"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.360106 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wbn\" (UniqueName: \"kubernetes.io/projected/f491adde-145d-44fc-9414-0fd92c41a114-kube-api-access-26wbn\") pod \"dnsmasq-dns-64594fd94f-bp9gj\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.406669 4870 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-7d56d856cf-n69v7"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.417513 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.429725 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-config\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.429825 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd26p\" (UniqueName: \"kubernetes.io/projected/033dbc66-0baa-46b3-8fda-3881303e4e40-kube-api-access-zd26p\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.429857 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-dns-svc\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.433248 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d56d856cf-n69v7"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.530621 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-config\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 
08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.530703 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd26p\" (UniqueName: \"kubernetes.io/projected/033dbc66-0baa-46b3-8fda-3881303e4e40-kube-api-access-zd26p\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.530730 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-dns-svc\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.531473 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-dns-svc\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.532003 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-config\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.551608 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd26p\" (UniqueName: \"kubernetes.io/projected/033dbc66-0baa-46b3-8fda-3881303e4e40-kube-api-access-zd26p\") pod \"dnsmasq-dns-7d56d856cf-n69v7\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") " pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.648662 4870 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.748464 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64594fd94f-bp9gj"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.766847 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.767111 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57467f675c-j7lcp"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.768175 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.785822 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57467f675c-j7lcp"] Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.835141 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pkdd\" (UniqueName: \"kubernetes.io/projected/a7cb8c3d-2157-4c52-a196-24d514b098ee-kube-api-access-6pkdd\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.835215 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-dns-svc\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.835248 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-config\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.936498 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pkdd\" (UniqueName: \"kubernetes.io/projected/a7cb8c3d-2157-4c52-a196-24d514b098ee-kube-api-access-6pkdd\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.937476 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-dns-svc\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.937508 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-config\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.939204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-config\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.939371 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-dns-svc\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: 
\"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:04 crc kubenswrapper[4870]: I0130 08:26:04.971813 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pkdd\" (UniqueName: \"kubernetes.io/projected/a7cb8c3d-2157-4c52-a196-24d514b098ee-kube-api-access-6pkdd\") pod \"dnsmasq-dns-57467f675c-j7lcp\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") " pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.096030 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.178551 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.179949 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.182579 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.183204 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.183438 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.185734 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lwd7k" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.185744 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.185909 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-config-data" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.186222 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.192258 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343600 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343641 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f21b9d-25bf-4a64-94ef-51d83b662ab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343701 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343736 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9pp\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-kube-api-access-mn9pp\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343765 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343786 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f21b9d-25bf-4a64-94ef-51d83b662ab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343799 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343822 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343851 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343867 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.343905 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.444798 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.444843 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.444865 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f21b9d-25bf-4a64-94ef-51d83b662ab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.444920 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.444960 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9pp\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-kube-api-access-mn9pp\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.444992 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445011 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f21b9d-25bf-4a64-94ef-51d83b662ab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445028 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445052 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445083 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445102 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445449 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445470 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.445527 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.446333 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.446778 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.447528 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.449602 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.450229 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f21b9d-25bf-4a64-94ef-51d83b662ab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.450428 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f21b9d-25bf-4a64-94ef-51d83b662ab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc 
kubenswrapper[4870]: I0130 08:26:05.452819 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.465153 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9pp\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-kube-api-access-mn9pp\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.471490 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.511857 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.526330 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.528475 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.530670 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.530853 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.531340 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.531520 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.531744 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.532776 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.532814 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hr5rb" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.551092 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648202 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648295 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648433 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648495 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4d8r\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-kube-api-access-g4d8r\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648533 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648559 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648667 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648697 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648723 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648747 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.648832 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750328 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750380 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750432 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750464 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750491 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750514 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750550 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750594 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750653 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750689 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.750715 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4d8r\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-kube-api-access-g4d8r\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.751087 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.751166 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.751279 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.751933 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.752196 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.752516 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.758714 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.759537 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.762379 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.764940 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.776803 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.779956 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4d8r\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-kube-api-access-g4d8r\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.867928 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.894388 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.895603 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.899159 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.899641 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.899974 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-lrn99" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.900130 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.900348 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.901310 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.901624 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Jan 30 08:26:05 crc kubenswrapper[4870]: I0130 08:26:05.905413 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054147 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054188 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054218 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054239 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ab884a9-b47a-476a-8f89-140093b96527-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054272 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054317 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " 
pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054498 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpv8h\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-kube-api-access-tpv8h\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054566 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054633 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054697 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ab884a9-b47a-476a-8f89-140093b96527-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.054729 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-plugins-conf\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156355 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156405 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156430 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ab884a9-b47a-476a-8f89-140093b96527-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156465 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156488 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-tls\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156531 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpv8h\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-kube-api-access-tpv8h\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156548 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156570 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156592 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ab884a9-b47a-476a-8f89-140093b96527-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156607 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-plugins-conf\") pod \"rabbitmq-notifications-server-0\" 
(UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156630 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.156754 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.158287 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.158335 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.159163 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.159259 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.159276 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ab884a9-b47a-476a-8f89-140093b96527-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.163200 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ab884a9-b47a-476a-8f89-140093b96527-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.166519 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.177182 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " 
pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.177250 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ab884a9-b47a-476a-8f89-140093b96527-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.182566 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpv8h\" (UniqueName: \"kubernetes.io/projected/2ab884a9-b47a-476a-8f89-140093b96527-kube-api-access-tpv8h\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.194962 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"2ab884a9-b47a-476a-8f89-140093b96527\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:06 crc kubenswrapper[4870]: I0130 08:26:06.227385 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:26:07 crc kubenswrapper[4870]: E0130 08:26:07.024368 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ebdb93_c8ce_45c1_b10f_037853cc99d9.slice/crio-424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa\": RecentStats: unable to find data in memory cache]" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.209242 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.210676 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.213674 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x4qpj" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.216190 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.217047 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.227864 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.230266 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.255244 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374264 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374360 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zcm4\" (UniqueName: \"kubernetes.io/projected/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-kube-api-access-2zcm4\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374408 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374429 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-kolla-config\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374474 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-config-data-default\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374512 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374577 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.374603 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.475726 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.475777 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.475806 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.475847 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zcm4\" (UniqueName: \"kubernetes.io/projected/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-kube-api-access-2zcm4\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.475904 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.475945 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-kolla-config\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.476004 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-config-data-default\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.476043 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 
08:26:07.476354 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.477570 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-kolla-config\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.477754 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-config-data-default\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.479560 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.479774 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.484801 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.485437 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.498163 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.501015 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zcm4\" (UniqueName: \"kubernetes.io/projected/2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a-kube-api-access-2zcm4\") pod \"openstack-galera-0\" (UID: \"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a\") " pod="openstack/openstack-galera-0" Jan 30 08:26:07 crc kubenswrapper[4870]: I0130 08:26:07.543117 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.741319 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.743026 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.748672 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mqkt7" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.751109 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.751424 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.754679 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.765851 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.796720 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31607550-5ccc-4b0b-9fbd-18007a61dcff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.796853 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31607550-5ccc-4b0b-9fbd-18007a61dcff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.796961 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.797054 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31607550-5ccc-4b0b-9fbd-18007a61dcff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.798977 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.799024 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.799132 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.799222 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4xc\" (UniqueName: 
\"kubernetes.io/projected/31607550-5ccc-4b0b-9fbd-18007a61dcff-kube-api-access-xv4xc\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.900818 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.900914 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4xc\" (UniqueName: \"kubernetes.io/projected/31607550-5ccc-4b0b-9fbd-18007a61dcff-kube-api-access-xv4xc\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.900959 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31607550-5ccc-4b0b-9fbd-18007a61dcff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901005 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31607550-5ccc-4b0b-9fbd-18007a61dcff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901054 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901093 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31607550-5ccc-4b0b-9fbd-18007a61dcff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901129 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901157 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901448 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901675 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.901679 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31607550-5ccc-4b0b-9fbd-18007a61dcff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.902176 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.903361 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31607550-5ccc-4b0b-9fbd-18007a61dcff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.905679 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31607550-5ccc-4b0b-9fbd-18007a61dcff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.907831 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31607550-5ccc-4b0b-9fbd-18007a61dcff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " 
pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.929175 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4xc\" (UniqueName: \"kubernetes.io/projected/31607550-5ccc-4b0b-9fbd-18007a61dcff-kube-api-access-xv4xc\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:08 crc kubenswrapper[4870]: I0130 08:26:08.932258 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"31607550-5ccc-4b0b-9fbd-18007a61dcff\") " pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.041954 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.043128 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.045406 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.045602 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.051999 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.055733 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fbd9c" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.105273 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsnf\" (UniqueName: \"kubernetes.io/projected/d691b652-0077-4709-9e9d-16b87c8d3d3c-kube-api-access-qpsnf\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.105327 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d691b652-0077-4709-9e9d-16b87c8d3d3c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.105480 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d691b652-0077-4709-9e9d-16b87c8d3d3c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.105671 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d691b652-0077-4709-9e9d-16b87c8d3d3c-config-data\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.105835 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d691b652-0077-4709-9e9d-16b87c8d3d3c-kolla-config\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.110911 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.207175 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d691b652-0077-4709-9e9d-16b87c8d3d3c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.207369 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d691b652-0077-4709-9e9d-16b87c8d3d3c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.207475 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d691b652-0077-4709-9e9d-16b87c8d3d3c-config-data\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.207602 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d691b652-0077-4709-9e9d-16b87c8d3d3c-kolla-config\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.207650 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsnf\" (UniqueName: \"kubernetes.io/projected/d691b652-0077-4709-9e9d-16b87c8d3d3c-kube-api-access-qpsnf\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.209197 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d691b652-0077-4709-9e9d-16b87c8d3d3c-kolla-config\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.209204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d691b652-0077-4709-9e9d-16b87c8d3d3c-config-data\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.214330 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d691b652-0077-4709-9e9d-16b87c8d3d3c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.218511 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d691b652-0077-4709-9e9d-16b87c8d3d3c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " 
pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.238715 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsnf\" (UniqueName: \"kubernetes.io/projected/d691b652-0077-4709-9e9d-16b87c8d3d3c-kube-api-access-qpsnf\") pod \"memcached-0\" (UID: \"d691b652-0077-4709-9e9d-16b87c8d3d3c\") " pod="openstack/memcached-0" Jan 30 08:26:09 crc kubenswrapper[4870]: I0130 08:26:09.358932 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 08:26:10 crc kubenswrapper[4870]: I0130 08:26:10.780420 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:26:10 crc kubenswrapper[4870]: I0130 08:26:10.781492 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 08:26:10 crc kubenswrapper[4870]: I0130 08:26:10.787770 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5g22f" Jan 30 08:26:10 crc kubenswrapper[4870]: I0130 08:26:10.798376 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:26:10 crc kubenswrapper[4870]: I0130 08:26:10.848731 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckp8w\" (UniqueName: \"kubernetes.io/projected/dd3b1e9c-90bb-46b7-8e19-edc1388b2a67-kube-api-access-ckp8w\") pod \"kube-state-metrics-0\" (UID: \"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67\") " pod="openstack/kube-state-metrics-0" Jan 30 08:26:10 crc kubenswrapper[4870]: I0130 08:26:10.949629 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckp8w\" (UniqueName: \"kubernetes.io/projected/dd3b1e9c-90bb-46b7-8e19-edc1388b2a67-kube-api-access-ckp8w\") pod \"kube-state-metrics-0\" (UID: \"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67\") " 
pod="openstack/kube-state-metrics-0" Jan 30 08:26:11 crc kubenswrapper[4870]: I0130 08:26:11.002742 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckp8w\" (UniqueName: \"kubernetes.io/projected/dd3b1e9c-90bb-46b7-8e19-edc1388b2a67-kube-api-access-ckp8w\") pod \"kube-state-metrics-0\" (UID: \"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67\") " pod="openstack/kube-state-metrics-0" Jan 30 08:26:11 crc kubenswrapper[4870]: I0130 08:26:11.095903 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.274322 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.276449 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.279362 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.279565 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.280629 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.280754 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.280930 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.281046 4870 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.281063 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-88lql" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.285125 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.299313 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372116 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z5p8\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-kube-api-access-9z5p8\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372194 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372235 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372265 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372316 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372345 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372378 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372433 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " 
pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372461 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.372506 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473443 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z5p8\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-kube-api-access-9z5p8\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473524 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473566 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473600 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473678 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473725 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473772 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473931 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.473985 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.474040 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.475362 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.476051 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.476149 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.478349 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.478898 4870 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.478938 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b608408b27cf3925c08af2a9b3a133a2b5eb87db3a290a5641371b0533b7f7d2/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.478905 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.479509 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.487617 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.488280 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.500228 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z5p8\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-kube-api-access-9z5p8\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.512429 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:12 crc kubenswrapper[4870]: I0130 08:26:12.613396 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.446352 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rwchz"] Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.447967 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.450826 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zmt2p" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.451216 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.451397 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.455339 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gznh8"] Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.457423 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.462569 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rwchz"] Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.470406 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gznh8"] Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519005 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-log\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519347 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpss8\" (UniqueName: \"kubernetes.io/projected/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-kube-api-access-dpss8\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519422 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-etc-ovs\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519466 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-log-ovn\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: 
I0130 08:26:14.519507 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-run-ovn\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519598 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-run\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519637 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496b707b-8de6-4228-b4fd-a48f3709586c-combined-ca-bundle\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519803 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-scripts\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519870 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-lib\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.519966 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-run\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.520064 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/496b707b-8de6-4228-b4fd-a48f3709586c-scripts\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.520105 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/496b707b-8de6-4228-b4fd-a48f3709586c-ovn-controller-tls-certs\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.520175 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2p6x\" (UniqueName: \"kubernetes.io/projected/496b707b-8de6-4228-b4fd-a48f3709586c-kube-api-access-m2p6x\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.547464 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d56d856cf-n69v7"] Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621560 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-lib\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " 
pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621608 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-run\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621643 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/496b707b-8de6-4228-b4fd-a48f3709586c-scripts\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621663 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/496b707b-8de6-4228-b4fd-a48f3709586c-ovn-controller-tls-certs\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621694 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2p6x\" (UniqueName: \"kubernetes.io/projected/496b707b-8de6-4228-b4fd-a48f3709586c-kube-api-access-m2p6x\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621719 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-log\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621741 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dpss8\" (UniqueName: \"kubernetes.io/projected/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-kube-api-access-dpss8\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621772 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-etc-ovs\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621796 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-log-ovn\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621818 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-run-ovn\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621847 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-run\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621864 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/496b707b-8de6-4228-b4fd-a48f3709586c-combined-ca-bundle\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.621950 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-scripts\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.622162 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-etc-ovs\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.624484 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-scripts\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.624484 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/496b707b-8de6-4228-b4fd-a48f3709586c-scripts\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.627176 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-run-ovn\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 
08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.627628 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-run\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.627680 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-log\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.627707 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-run\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.627769 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-var-lib\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.627760 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/496b707b-8de6-4228-b4fd-a48f3709586c-var-log-ovn\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.629916 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/496b707b-8de6-4228-b4fd-a48f3709586c-combined-ca-bundle\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.638605 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/496b707b-8de6-4228-b4fd-a48f3709586c-ovn-controller-tls-certs\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.640222 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2p6x\" (UniqueName: \"kubernetes.io/projected/496b707b-8de6-4228-b4fd-a48f3709586c-kube-api-access-m2p6x\") pod \"ovn-controller-rwchz\" (UID: \"496b707b-8de6-4228-b4fd-a48f3709586c\") " pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.645621 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpss8\" (UniqueName: \"kubernetes.io/projected/b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2-kube-api-access-dpss8\") pod \"ovn-controller-ovs-gznh8\" (UID: \"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2\") " pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.779694 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz" Jan 30 08:26:14 crc kubenswrapper[4870]: I0130 08:26:14.792614 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.319640 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.321612 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.324601 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-72pm2" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.328060 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.328361 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.331333 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.334470 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.337679 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435271 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625f2d84-6699-4e9f-881e-e96509760e9d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435339 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625f2d84-6699-4e9f-881e-e96509760e9d-config\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435508 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmxrq\" 
(UniqueName: \"kubernetes.io/projected/625f2d84-6699-4e9f-881e-e96509760e9d-kube-api-access-vmxrq\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435610 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435681 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435705 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/625f2d84-6699-4e9f-881e-e96509760e9d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435807 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.435904 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537202 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537277 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537323 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625f2d84-6699-4e9f-881e-e96509760e9d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537361 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625f2d84-6699-4e9f-881e-e96509760e9d-config\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537401 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmxrq\" (UniqueName: \"kubernetes.io/projected/625f2d84-6699-4e9f-881e-e96509760e9d-kube-api-access-vmxrq\") pod \"ovsdbserver-sb-0\" (UID: 
\"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537438 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537470 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.537494 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/625f2d84-6699-4e9f-881e-e96509760e9d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.538202 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/625f2d84-6699-4e9f-881e-e96509760e9d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.538614 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.541441 4870 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625f2d84-6699-4e9f-881e-e96509760e9d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.542764 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625f2d84-6699-4e9f-881e-e96509760e9d-config\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.543449 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.548247 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.556471 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625f2d84-6699-4e9f-881e-e96509760e9d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.557292 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmxrq\" (UniqueName: \"kubernetes.io/projected/625f2d84-6699-4e9f-881e-e96509760e9d-kube-api-access-vmxrq\") pod \"ovsdbserver-sb-0\" (UID: 
\"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.565599 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"625f2d84-6699-4e9f-881e-e96509760e9d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:15 crc kubenswrapper[4870]: I0130 08:26:15.684966 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:17 crc kubenswrapper[4870]: E0130 08:26:17.286027 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ebdb93_c8ce_45c1_b10f_037853cc99d9.slice/crio-424c2889668056eeecef60e4a12c02f8812311e06dd82e3b2b01529aeee939aa\": RecentStats: unable to find data in memory cache]" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.487717 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.489283 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.494686 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.494953 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.495045 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wgfkw" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.495236 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.504894 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608215 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608344 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608401 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw7h5\" (UniqueName: \"kubernetes.io/projected/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-kube-api-access-xw7h5\") pod \"ovsdbserver-nb-0\" (UID: 
\"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608537 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608584 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608613 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608720 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.608760 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-config\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 
30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712093 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712158 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712199 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712275 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712306 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-config\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712330 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712388 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712420 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw7h5\" (UniqueName: \"kubernetes.io/projected/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-kube-api-access-xw7h5\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.712514 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.713093 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.713619 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" 
Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.714061 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-config\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.722851 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.723010 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.723059 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.756783 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw7h5\" (UniqueName: \"kubernetes.io/projected/e9a5fd23-1240-4284-91cf-b57f4b2e3d02-kube-api-access-xw7h5\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.767708 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e9a5fd23-1240-4284-91cf-b57f4b2e3d02\") " pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:18 crc kubenswrapper[4870]: I0130 08:26:18.824912 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:19 crc kubenswrapper[4870]: W0130 08:26:19.137679 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod033dbc66_0baa_46b3_8fda_3881303e4e40.slice/crio-be7d1aa3f3670be6d94e4a7facc9d6deaa6f49ecc11c230b4a15c8f46a3117b6 WatchSource:0}: Error finding container be7d1aa3f3670be6d94e4a7facc9d6deaa6f49ecc11c230b4a15c8f46a3117b6: Status 404 returned error can't find the container with id be7d1aa3f3670be6d94e4a7facc9d6deaa6f49ecc11c230b4a15c8f46a3117b6 Jan 30 08:26:19 crc kubenswrapper[4870]: I0130 08:26:19.992807 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" event={"ID":"033dbc66-0baa-46b3-8fda-3881303e4e40","Type":"ContainerStarted","Data":"be7d1aa3f3670be6d94e4a7facc9d6deaa6f49ecc11c230b4a15c8f46a3117b6"} Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.048315 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.048378 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.048495 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjl4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Volu
meDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6dd95798b9-btgvj_openstack(c135f9a2-386b-4108-a40d-a703e4d72b13): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.049655 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" podUID="c135f9a2-386b-4108-a40d-a703e4d72b13" Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.097183 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.097237 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.097349 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwpxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-9cd4f5bf5-n8lzz_openstack(7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:26:20 crc kubenswrapper[4870]: E0130 08:26:20.098500 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" podUID="7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2" Jan 30 08:26:20 crc kubenswrapper[4870]: I0130 08:26:20.400151 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:26:20 crc kubenswrapper[4870]: I0130 08:26:20.649775 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 08:26:20 crc kubenswrapper[4870]: W0130 08:26:20.656500 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd691b652_0077_4709_9e9d_16b87c8d3d3c.slice/crio-af6261a516f97db7788d12843796598bca22d6a5607376e0bbfd238a94ac03f5 WatchSource:0}: Error finding container af6261a516f97db7788d12843796598bca22d6a5607376e0bbfd238a94ac03f5: Status 404 returned error can't find the container with id af6261a516f97db7788d12843796598bca22d6a5607376e0bbfd238a94ac03f5 Jan 30 08:26:20 crc kubenswrapper[4870]: I0130 08:26:20.656647 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 08:26:20 crc kubenswrapper[4870]: W0130 08:26:20.659174 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3b1e9c_90bb_46b7_8e19_edc1388b2a67.slice/crio-e0a5d30d3c77180ecb50a167671b3d1f7955018f6057ec5124abb113c1fa8b6f WatchSource:0}: Error finding container e0a5d30d3c77180ecb50a167671b3d1f7955018f6057ec5124abb113c1fa8b6f: Status 404 returned error can't find the container with id e0a5d30d3c77180ecb50a167671b3d1f7955018f6057ec5124abb113c1fa8b6f Jan 30 08:26:20 crc kubenswrapper[4870]: W0130 08:26:20.660778 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ce45bb8_e721_40bb_a9fb_ac0d6b0deb4a.slice/crio-b4a40d16fc5fb149c0fc174da87c73a5c696d7b815a99e3d16fd9b278a7ad8d2 WatchSource:0}: Error finding container 
b4a40d16fc5fb149c0fc174da87c73a5c696d7b815a99e3d16fd9b278a7ad8d2: Status 404 returned error can't find the container with id b4a40d16fc5fb149c0fc174da87c73a5c696d7b815a99e3d16fd9b278a7ad8d2 Jan 30 08:26:20 crc kubenswrapper[4870]: I0130 08:26:20.662930 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 08:26:20 crc kubenswrapper[4870]: I0130 08:26:20.668479 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:26:20 crc kubenswrapper[4870]: I0130 08:26:20.674717 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.000622 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f21b9d-25bf-4a64-94ef-51d83b662ab3","Type":"ContainerStarted","Data":"e007871b6d10423ef6514301a7948e0b65aeec9e801d811cb06f4a5040316a29"} Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.001718 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67","Type":"ContainerStarted","Data":"e0a5d30d3c77180ecb50a167671b3d1f7955018f6057ec5124abb113c1fa8b6f"} Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.003153 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d691b652-0077-4709-9e9d-16b87c8d3d3c","Type":"ContainerStarted","Data":"af6261a516f97db7788d12843796598bca22d6a5607376e0bbfd238a94ac03f5"} Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.004101 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31607550-5ccc-4b0b-9fbd-18007a61dcff","Type":"ContainerStarted","Data":"99ab441e631a7a34bd07c114bbba4517a1c594a32c4de2267edc0cab42d7f3d5"} Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.005069 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd","Type":"ContainerStarted","Data":"3f0499acc4a6b0c8f2d313af1131c23462a36b6d1d5cfab2eb6312a0f9c1c357"} Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.006935 4870 generic.go:334] "Generic (PLEG): container finished" podID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerID="fa2967a57d46eadef06dee8c0ac950d7fe80b8807314b92bceff366e74a2aaa8" exitCode=0 Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.007008 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" event={"ID":"033dbc66-0baa-46b3-8fda-3881303e4e40","Type":"ContainerDied","Data":"fa2967a57d46eadef06dee8c0ac950d7fe80b8807314b92bceff366e74a2aaa8"} Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.009506 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a","Type":"ContainerStarted","Data":"b4a40d16fc5fb149c0fc174da87c73a5c696d7b815a99e3d16fd9b278a7ad8d2"} Jan 30 08:26:21 crc kubenswrapper[4870]: E0130 08:26:21.066975 4870 mount_linux.go:282] Mount failed: exit status 32 Jan 30 08:26:21 crc kubenswrapper[4870]: Mounting command: mount Jan 30 08:26:21 crc kubenswrapper[4870]: Mounting arguments: --no-canonicalize -o bind /proc/4870/fd/26 /var/lib/kubelet/pods/033dbc66-0baa-46b3-8fda-3881303e4e40/volume-subpaths/dns-svc/dnsmasq-dns/1 Jan 30 08:26:21 crc kubenswrapper[4870]: Output: mount: /var/lib/kubelet/pods/033dbc66-0baa-46b3-8fda-3881303e4e40/volume-subpaths/dns-svc/dnsmasq-dns/1: mount(2) system call failed: No such file or directory. 
Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.078788 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57467f675c-j7lcp"] Jan 30 08:26:21 crc kubenswrapper[4870]: E0130 08:26:21.101972 4870 kubelet_pods.go:349] "Failed to prepare subPath for volumeMount of the container" err=< Jan 30 08:26:21 crc kubenswrapper[4870]: error mounting /var/lib/kubelet/pods/033dbc66-0baa-46b3-8fda-3881303e4e40/volumes/kubernetes.io~configmap/dns-svc/..2026_01_30_08_26_04.1033670092/dns-svc: mount failed: exit status 32 Jan 30 08:26:21 crc kubenswrapper[4870]: Mounting command: mount Jan 30 08:26:21 crc kubenswrapper[4870]: Mounting arguments: --no-canonicalize -o bind /proc/4870/fd/26 /var/lib/kubelet/pods/033dbc66-0baa-46b3-8fda-3881303e4e40/volume-subpaths/dns-svc/dnsmasq-dns/1 Jan 30 08:26:21 crc kubenswrapper[4870]: Output: mount: /var/lib/kubelet/pods/033dbc66-0baa-46b3-8fda-3881303e4e40/volume-subpaths/dns-svc/dnsmasq-dns/1: mount(2) system call failed: No such file or directory. 
Jan 30 08:26:21 crc kubenswrapper[4870]: > containerName="dnsmasq-dns" volumeMountName="dns-svc" Jan 30 08:26:21 crc kubenswrapper[4870]: W0130 08:26:21.102160 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7cb8c3d_2157_4c52_a196_24d514b098ee.slice/crio-166a34a595f3784fb50b62bea8c6b3aecbe48d2fb8d288bfed3951f4b3b662b9 WatchSource:0}: Error finding container 166a34a595f3784fb50b62bea8c6b3aecbe48d2fb8d288bfed3951f4b3b662b9: Status 404 returned error can't find the container with id 166a34a595f3784fb50b62bea8c6b3aecbe48d2fb8d288bfed3951f4b3b662b9 Jan 30 08:26:21 crc kubenswrapper[4870]: E0130 08:26:21.102139 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:dnsmasq-dns,Image:38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd26p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7d56d856cf-n69v7_openstack(033dbc66-0baa-46b3-8fda-3881303e4e40): CreateContainerConfigError: failed to prepare subPath for volumeMount \"dns-svc\" of container \"dnsmasq-dns\"" logger="UnhandledError" Jan 30 08:26:21 crc kubenswrapper[4870]: E0130 08:26:21.105553 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerConfigError: \"failed to prepare subPath for volumeMount \\\"dns-svc\\\" of container \\\"dnsmasq-dns\\\"\"" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.160056 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64594fd94f-bp9gj"] Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.182185 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.195648 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rwchz"] Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.211182 4870 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Jan 30 08:26:21 crc kubenswrapper[4870]: W0130 08:26:21.236857 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab884a9_b47a_476a_8f89_140093b96527.slice/crio-b7b01500f518cd5372801924da9b7f6f7f843abe8b3213b1c7678796b6012613 WatchSource:0}: Error finding container b7b01500f518cd5372801924da9b7f6f7f843abe8b3213b1c7678796b6012613: Status 404 returned error can't find the container with id b7b01500f518cd5372801924da9b7f6f7f843abe8b3213b1c7678796b6012613 Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.267150 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 08:26:21 crc kubenswrapper[4870]: W0130 08:26:21.284234 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625f2d84_6699_4e9f_881e_e96509760e9d.slice/crio-34c67fa453057204c29fb530e7edee04517cc65eea6c392dcd2f034238d625bf WatchSource:0}: Error finding container 34c67fa453057204c29fb530e7edee04517cc65eea6c392dcd2f034238d625bf: Status 404 returned error can't find the container with id 34c67fa453057204c29fb530e7edee04517cc65eea6c392dcd2f034238d625bf Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.554404 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.680043 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.680216 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-config\") pod \"c135f9a2-386b-4108-a40d-a703e4d72b13\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.680347 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjl4f\" (UniqueName: \"kubernetes.io/projected/c135f9a2-386b-4108-a40d-a703e4d72b13-kube-api-access-hjl4f\") pod \"c135f9a2-386b-4108-a40d-a703e4d72b13\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.680421 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-dns-svc\") pod \"c135f9a2-386b-4108-a40d-a703e4d72b13\" (UID: \"c135f9a2-386b-4108-a40d-a703e4d72b13\") " Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.681061 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c135f9a2-386b-4108-a40d-a703e4d72b13" (UID: "c135f9a2-386b-4108-a40d-a703e4d72b13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.681049 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-config" (OuterVolumeSpecName: "config") pod "c135f9a2-386b-4108-a40d-a703e4d72b13" (UID: "c135f9a2-386b-4108-a40d-a703e4d72b13"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.685943 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c135f9a2-386b-4108-a40d-a703e4d72b13-kube-api-access-hjl4f" (OuterVolumeSpecName: "kube-api-access-hjl4f") pod "c135f9a2-386b-4108-a40d-a703e4d72b13" (UID: "c135f9a2-386b-4108-a40d-a703e4d72b13"). InnerVolumeSpecName "kube-api-access-hjl4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.782252 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwpxs\" (UniqueName: \"kubernetes.io/projected/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-kube-api-access-bwpxs\") pod \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.782548 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-config\") pod \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\" (UID: \"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2\") " Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.783020 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.783300 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c135f9a2-386b-4108-a40d-a703e4d72b13-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.783311 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjl4f\" (UniqueName: \"kubernetes.io/projected/c135f9a2-386b-4108-a40d-a703e4d72b13-kube-api-access-hjl4f\") on node \"crc\" 
DevicePath \"\"" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.783230 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-config" (OuterVolumeSpecName: "config") pod "7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2" (UID: "7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.786398 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-kube-api-access-bwpxs" (OuterVolumeSpecName: "kube-api-access-bwpxs") pod "7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2" (UID: "7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2"). InnerVolumeSpecName "kube-api-access-bwpxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.886637 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:21 crc kubenswrapper[4870]: I0130 08:26:21.887013 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwpxs\" (UniqueName: \"kubernetes.io/projected/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2-kube-api-access-bwpxs\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.025399 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz" event={"ID":"496b707b-8de6-4228-b4fd-a48f3709586c","Type":"ContainerStarted","Data":"2a49278dde0d40d9881be346be437fe8d7204dd6aa615f031f43d2552146f652"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.027572 4870 generic.go:334] "Generic (PLEG): container finished" podID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerID="f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc" exitCode=0 Jan 
30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.027629 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" event={"ID":"a7cb8c3d-2157-4c52-a196-24d514b098ee","Type":"ContainerDied","Data":"f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.027648 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" event={"ID":"a7cb8c3d-2157-4c52-a196-24d514b098ee","Type":"ContainerStarted","Data":"166a34a595f3784fb50b62bea8c6b3aecbe48d2fb8d288bfed3951f4b3b662b9"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.030535 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerStarted","Data":"709385d651d4bcb103b7d7d0e2928451ab8b488203130eb1f7baa5322860b0f5"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.032016 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"2ab884a9-b47a-476a-8f89-140093b96527","Type":"ContainerStarted","Data":"b7b01500f518cd5372801924da9b7f6f7f843abe8b3213b1c7678796b6012613"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.034516 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.037339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd95798b9-btgvj" event={"ID":"c135f9a2-386b-4108-a40d-a703e4d72b13","Type":"ContainerDied","Data":"85ed67ebbca181d72c31532b0492443d898520cd3d920466a779ed625e90274a"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.086149 4870 generic.go:334] "Generic (PLEG): container finished" podID="f491adde-145d-44fc-9414-0fd92c41a114" containerID="e0bc981d691db0b6447c821e168e88197478822865e6c93bfc11c3c320369da7" exitCode=0 Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.137837 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.140465 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" event={"ID":"f491adde-145d-44fc-9414-0fd92c41a114","Type":"ContainerDied","Data":"e0bc981d691db0b6447c821e168e88197478822865e6c93bfc11c3c320369da7"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.140516 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" event={"ID":"f491adde-145d-44fc-9414-0fd92c41a114","Type":"ContainerStarted","Data":"86a61707384abe477056d320911123a063b0ba5f70255594ea4531c5ddc156a1"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.140532 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"625f2d84-6699-4e9f-881e-e96509760e9d","Type":"ContainerStarted","Data":"34c67fa453057204c29fb530e7edee04517cc65eea6c392dcd2f034238d625bf"} Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.140548 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9cd4f5bf5-n8lzz" 
event={"ID":"7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2","Type":"ContainerDied","Data":"954aad0a8aef2531b8fe421c82e8e2a545985a769437ed6265789109329f05be"} Jan 30 08:26:22 crc kubenswrapper[4870]: E0130 08:26:22.143504 4870 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7b3642e2a85c0a9325578b22f6081309876b1e1994b22fd2150d35aa4cf293fe/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7b3642e2a85c0a9325578b22f6081309876b1e1994b22fd2150d35aa4cf293fe/diff: no such file or directory, extraDiskErr: Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.153770 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gznh8"] Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.520298 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9cd4f5bf5-n8lzz"] Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.535801 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9cd4f5bf5-n8lzz"] Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.552951 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd95798b9-btgvj"] Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.565678 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dd95798b9-btgvj"] Jan 30 08:26:22 crc kubenswrapper[4870]: I0130 08:26:22.816561 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 08:26:24 crc kubenswrapper[4870]: I0130 08:26:24.092419 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2" path="/var/lib/kubelet/pods/7c2e9ac8-fed6-4f2e-9a1b-26e0b253a3d2/volumes" Jan 30 08:26:24 crc kubenswrapper[4870]: I0130 08:26:24.093279 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c135f9a2-386b-4108-a40d-a703e4d72b13" 
path="/var/lib/kubelet/pods/c135f9a2-386b-4108-a40d-a703e4d72b13/volumes" Jan 30 08:26:24 crc kubenswrapper[4870]: I0130 08:26:24.156095 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gznh8" event={"ID":"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2","Type":"ContainerStarted","Data":"156d64623cdcbda588e81e7d32516ad1ea6979270ad5c6324909bcb2aa418fc4"} Jan 30 08:26:24 crc kubenswrapper[4870]: W0130 08:26:24.526292 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9a5fd23_1240_4284_91cf_b57f4b2e3d02.slice/crio-d7ec8a87b0ed4e5b5483de9c1bc68ccfc75f940481071af5486f21894eb3aa9e WatchSource:0}: Error finding container d7ec8a87b0ed4e5b5483de9c1bc68ccfc75f940481071af5486f21894eb3aa9e: Status 404 returned error can't find the container with id d7ec8a87b0ed4e5b5483de9c1bc68ccfc75f940481071af5486f21894eb3aa9e Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.046977 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.165996 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e9a5fd23-1240-4284-91cf-b57f4b2e3d02","Type":"ContainerStarted","Data":"d7ec8a87b0ed4e5b5483de9c1bc68ccfc75f940481071af5486f21894eb3aa9e"} Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.168144 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" event={"ID":"f491adde-145d-44fc-9414-0fd92c41a114","Type":"ContainerDied","Data":"86a61707384abe477056d320911123a063b0ba5f70255594ea4531c5ddc156a1"} Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.168179 4870 scope.go:117] "RemoveContainer" containerID="e0bc981d691db0b6447c821e168e88197478822865e6c93bfc11c3c320369da7" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.168305 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64594fd94f-bp9gj" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.193754 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-config\") pod \"f491adde-145d-44fc-9414-0fd92c41a114\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.193809 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-dns-svc\") pod \"f491adde-145d-44fc-9414-0fd92c41a114\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.193888 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26wbn\" (UniqueName: 
\"kubernetes.io/projected/f491adde-145d-44fc-9414-0fd92c41a114-kube-api-access-26wbn\") pod \"f491adde-145d-44fc-9414-0fd92c41a114\" (UID: \"f491adde-145d-44fc-9414-0fd92c41a114\") " Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.198251 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f491adde-145d-44fc-9414-0fd92c41a114-kube-api-access-26wbn" (OuterVolumeSpecName: "kube-api-access-26wbn") pod "f491adde-145d-44fc-9414-0fd92c41a114" (UID: "f491adde-145d-44fc-9414-0fd92c41a114"). InnerVolumeSpecName "kube-api-access-26wbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.216503 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-config" (OuterVolumeSpecName: "config") pod "f491adde-145d-44fc-9414-0fd92c41a114" (UID: "f491adde-145d-44fc-9414-0fd92c41a114"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.233774 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f491adde-145d-44fc-9414-0fd92c41a114" (UID: "f491adde-145d-44fc-9414-0fd92c41a114"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.295856 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.295903 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26wbn\" (UniqueName: \"kubernetes.io/projected/f491adde-145d-44fc-9414-0fd92c41a114-kube-api-access-26wbn\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.295915 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f491adde-145d-44fc-9414-0fd92c41a114-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.525226 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64594fd94f-bp9gj"] Jan 30 08:26:25 crc kubenswrapper[4870]: I0130 08:26:25.525294 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64594fd94f-bp9gj"] Jan 30 08:26:26 crc kubenswrapper[4870]: I0130 08:26:26.090140 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f491adde-145d-44fc-9414-0fd92c41a114" path="/var/lib/kubelet/pods/f491adde-145d-44fc-9414-0fd92c41a114/volumes" Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.283332 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d691b652-0077-4709-9e9d-16b87c8d3d3c","Type":"ContainerStarted","Data":"5e167b1036f951cd0167c7e6fa2feda24d5dd87ebdf0206166a3802c96ce12f9"} Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.283971 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.286064 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" event={"ID":"033dbc66-0baa-46b3-8fda-3881303e4e40","Type":"ContainerStarted","Data":"a4110fcf56688524b599e752b9185dc87cb5234e1202ed239766375872ab4d4d"} Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.286277 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.288825 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" event={"ID":"a7cb8c3d-2157-4c52-a196-24d514b098ee","Type":"ContainerStarted","Data":"7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680"} Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.288995 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.316907 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.71989495 podStartE2EDuration="23.316870465s" podCreationTimestamp="2026-01-30 08:26:09 +0000 UTC" firstStartedPulling="2026-01-30 08:26:20.669896282 +0000 UTC m=+1019.365443391" lastFinishedPulling="2026-01-30 08:26:29.266871747 +0000 UTC m=+1027.962418906" observedRunningTime="2026-01-30 08:26:32.303411716 +0000 UTC m=+1030.998958835" watchObservedRunningTime="2026-01-30 08:26:32.316870465 +0000 UTC m=+1031.012417584" Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.332603 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" podStartSLOduration=27.25844405 podStartE2EDuration="28.332587324s" podCreationTimestamp="2026-01-30 08:26:04 +0000 UTC" firstStartedPulling="2026-01-30 08:26:19.142017419 +0000 UTC m=+1017.837564538" lastFinishedPulling="2026-01-30 08:26:20.216160693 +0000 UTC m=+1018.911707812" observedRunningTime="2026-01-30 08:26:32.3257053 +0000 UTC 
m=+1031.021252409" watchObservedRunningTime="2026-01-30 08:26:32.332587324 +0000 UTC m=+1031.028134443" Jan 30 08:26:32 crc kubenswrapper[4870]: I0130 08:26:32.360612 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" podStartSLOduration=28.360580305 podStartE2EDuration="28.360580305s" podCreationTimestamp="2026-01-30 08:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:26:32.353266328 +0000 UTC m=+1031.048813427" watchObservedRunningTime="2026-01-30 08:26:32.360580305 +0000 UTC m=+1031.056127724" Jan 30 08:26:33 crc kubenswrapper[4870]: I0130 08:26:33.300218 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gznh8" event={"ID":"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2","Type":"ContainerStarted","Data":"ef7c5e94c6e0e8c94d28032137e089b103eebf493c58abbb1a5ebd1b4dd0bb24"} Jan 30 08:26:33 crc kubenswrapper[4870]: I0130 08:26:33.303035 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"625f2d84-6699-4e9f-881e-e96509760e9d","Type":"ContainerStarted","Data":"a248299522e63e13e8b74efca6008e7af89425ea5527bd7ce8be41f68e3c1636"} Jan 30 08:26:33 crc kubenswrapper[4870]: I0130 08:26:33.306093 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31607550-5ccc-4b0b-9fbd-18007a61dcff","Type":"ContainerStarted","Data":"0a51df6c2a8c835be83788e6c0e9cc99339b4ef2dc9fce8dc1f0609f8b094b25"} Jan 30 08:26:33 crc kubenswrapper[4870]: I0130 08:26:33.308470 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e9a5fd23-1240-4284-91cf-b57f4b2e3d02","Type":"ContainerStarted","Data":"83821d5d143761eee4d4af3ed223fbc1ab521a17e0b508f8b9ca0a3c17569a8a"} Jan 30 08:26:33 crc kubenswrapper[4870]: I0130 08:26:33.312028 4870 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a","Type":"ContainerStarted","Data":"36c6b3f0e330f4c5764ceb8dd30c047b28952964612889bae6f27160bd91c81f"} Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.323146 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"2ab884a9-b47a-476a-8f89-140093b96527","Type":"ContainerStarted","Data":"1990bb623e12d14af684a0ba5a125e7077393acb0eeb246cda1b7953fb41a71d"} Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.325247 4870 generic.go:334] "Generic (PLEG): container finished" podID="b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2" containerID="ef7c5e94c6e0e8c94d28032137e089b103eebf493c58abbb1a5ebd1b4dd0bb24" exitCode=0 Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.325282 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gznh8" event={"ID":"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2","Type":"ContainerDied","Data":"ef7c5e94c6e0e8c94d28032137e089b103eebf493c58abbb1a5ebd1b4dd0bb24"} Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.329824 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67","Type":"ContainerStarted","Data":"9f3acd6bbada01f386eced240ac88a963b5d6446c306ab4b66863d5ccf3e1172"} Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.329941 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.332220 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz" event={"ID":"496b707b-8de6-4228-b4fd-a48f3709586c","Type":"ContainerStarted","Data":"8424b8b1a4215db2df55f4ae408c3652cafc92ccbffbd1933bcf3dc11b2b4320"} Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.384960 4870 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-rwchz" podStartSLOduration=10.345152088 podStartE2EDuration="20.384943708s" podCreationTimestamp="2026-01-30 08:26:14 +0000 UTC" firstStartedPulling="2026-01-30 08:26:21.225283365 +0000 UTC m=+1019.920830474" lastFinishedPulling="2026-01-30 08:26:31.265074985 +0000 UTC m=+1029.960622094" observedRunningTime="2026-01-30 08:26:34.377535647 +0000 UTC m=+1033.073082766" watchObservedRunningTime="2026-01-30 08:26:34.384943708 +0000 UTC m=+1033.080490827" Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.393473 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.979362329 podStartE2EDuration="24.393451212s" podCreationTimestamp="2026-01-30 08:26:10 +0000 UTC" firstStartedPulling="2026-01-30 08:26:20.6691696 +0000 UTC m=+1019.364716709" lastFinishedPulling="2026-01-30 08:26:33.083258483 +0000 UTC m=+1031.778805592" observedRunningTime="2026-01-30 08:26:34.393421601 +0000 UTC m=+1033.088968710" watchObservedRunningTime="2026-01-30 08:26:34.393451212 +0000 UTC m=+1033.088998331" Jan 30 08:26:34 crc kubenswrapper[4870]: I0130 08:26:34.780841 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rwchz" Jan 30 08:26:35 crc kubenswrapper[4870]: I0130 08:26:35.339822 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd","Type":"ContainerStarted","Data":"15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49"} Jan 30 08:26:35 crc kubenswrapper[4870]: I0130 08:26:35.344581 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerStarted","Data":"431cfd92f94e1fd13cdf200e4b8c59047ac3e311acf24702741a42c672002d0e"} Jan 30 08:26:35 crc kubenswrapper[4870]: I0130 08:26:35.346380 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f21b9d-25bf-4a64-94ef-51d83b662ab3","Type":"ContainerStarted","Data":"55e6a4b3af15640088e3e1927ba88636a5cf35ec532fc2df3395e46ebcf07d79"} Jan 30 08:26:36 crc kubenswrapper[4870]: I0130 08:26:36.357736 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gznh8" event={"ID":"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2","Type":"ContainerStarted","Data":"41df10a95d047b25ae7587f3ced3c928ce3c50893926dc89c0fd432d04195eba"} Jan 30 08:26:36 crc kubenswrapper[4870]: I0130 08:26:36.358490 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gznh8" event={"ID":"b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2","Type":"ContainerStarted","Data":"1a859ee2531e71d18025357ed182dd3124bf293169ea74ac97fb7ddb2e9a18c9"} Jan 30 08:26:36 crc kubenswrapper[4870]: I0130 08:26:36.359755 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"625f2d84-6699-4e9f-881e-e96509760e9d","Type":"ContainerStarted","Data":"9df60d1d62b2ce62252b4703486760ad04e0294a6b54e78f31b03d319f243872"} Jan 30 08:26:36 crc kubenswrapper[4870]: I0130 08:26:36.362231 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e9a5fd23-1240-4284-91cf-b57f4b2e3d02","Type":"ContainerStarted","Data":"7c453dde5d362e21f3144c4b39f706dbcf9d1c3d8699d071ad0f7f13ba2c0d74"} Jan 30 08:26:36 crc kubenswrapper[4870]: I0130 08:26:36.421653 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.109272934 podStartE2EDuration="19.421633894s" podCreationTimestamp="2026-01-30 08:26:17 +0000 UTC" firstStartedPulling="2026-01-30 08:26:24.536088698 +0000 UTC m=+1023.231635857" lastFinishedPulling="2026-01-30 08:26:35.848449698 +0000 UTC m=+1034.543996817" observedRunningTime="2026-01-30 08:26:36.419129056 +0000 UTC m=+1035.114676175" 
watchObservedRunningTime="2026-01-30 08:26:36.421633894 +0000 UTC m=+1035.117181003" Jan 30 08:26:36 crc kubenswrapper[4870]: I0130 08:26:36.825968 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:36 crc kubenswrapper[4870]: I0130 08:26:36.870576 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:37 crc kubenswrapper[4870]: I0130 08:26:37.371583 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 30 08:26:37 crc kubenswrapper[4870]: I0130 08:26:37.402444 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gznh8" podStartSLOduration=15.63691538 podStartE2EDuration="23.402424833s" podCreationTimestamp="2026-01-30 08:26:14 +0000 UTC" firstStartedPulling="2026-01-30 08:26:23.141006606 +0000 UTC m=+1021.836553715" lastFinishedPulling="2026-01-30 08:26:30.906516019 +0000 UTC m=+1029.602063168" observedRunningTime="2026-01-30 08:26:37.3942814 +0000 UTC m=+1036.089828499" watchObservedRunningTime="2026-01-30 08:26:37.402424833 +0000 UTC m=+1036.097971942" Jan 30 08:26:37 crc kubenswrapper[4870]: I0130 08:26:37.424995 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.879714566 podStartE2EDuration="23.424979225s" podCreationTimestamp="2026-01-30 08:26:14 +0000 UTC" firstStartedPulling="2026-01-30 08:26:21.298044159 +0000 UTC m=+1019.993591268" lastFinishedPulling="2026-01-30 08:26:35.843308818 +0000 UTC m=+1034.538855927" observedRunningTime="2026-01-30 08:26:37.416981286 +0000 UTC m=+1036.112528405" watchObservedRunningTime="2026-01-30 08:26:37.424979225 +0000 UTC m=+1036.120526334" Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.416798 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" 
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.679229 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57467f675c-j7lcp"]
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.679698 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" podUID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerName="dnsmasq-dns" containerID="cri-o://7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680" gracePeriod=10
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.689145 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57467f675c-j7lcp"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.711985 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b46657c9-lg2gz"]
Jan 30 08:26:38 crc kubenswrapper[4870]: E0130 08:26:38.712416 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f491adde-145d-44fc-9414-0fd92c41a114" containerName="init"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.712436 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f491adde-145d-44fc-9414-0fd92c41a114" containerName="init"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.712650 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f491adde-145d-44fc-9414-0fd92c41a114" containerName="init"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.713764 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.717114 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.727551 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b46657c9-lg2gz"]
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.855808 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-56vf8"]
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.857710 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.860160 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-config\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.860277 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz569\" (UniqueName: \"kubernetes.io/projected/a2d4c089-1da0-424e-9f99-008407498c84-kube-api-access-sz569\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.860311 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-dns-svc\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.860347 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-ovsdbserver-nb\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.860427 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-56vf8"]
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.864846 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.962681 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eaa9048d-8c54-4054-87d1-69c6746c1479-ovn-rundir\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.962780 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz569\" (UniqueName: \"kubernetes.io/projected/a2d4c089-1da0-424e-9f99-008407498c84-kube-api-access-sz569\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.962839 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-dns-svc\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.962890 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4rd\" (UniqueName: \"kubernetes.io/projected/eaa9048d-8c54-4054-87d1-69c6746c1479-kube-api-access-xc4rd\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.962938 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-ovsdbserver-nb\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.962969 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa9048d-8c54-4054-87d1-69c6746c1479-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.963020 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-config\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.963050 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eaa9048d-8c54-4054-87d1-69c6746c1479-ovs-rundir\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.963076 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa9048d-8c54-4054-87d1-69c6746c1479-combined-ca-bundle\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.963110 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa9048d-8c54-4054-87d1-69c6746c1479-config\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.964500 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-dns-svc\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.964500 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-config\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.964766 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-ovsdbserver-nb\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:38 crc kubenswrapper[4870]: I0130 08:26:38.991581 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz569\" (UniqueName: \"kubernetes.io/projected/a2d4c089-1da0-424e-9f99-008407498c84-kube-api-access-sz569\") pod \"dnsmasq-dns-57b46657c9-lg2gz\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.018998 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d56d856cf-n69v7"]
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.019452 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerName="dnsmasq-dns" containerID="cri-o://a4110fcf56688524b599e752b9185dc87cb5234e1202ed239766375872ab4d4d" gracePeriod=10
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.028709 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.035344 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5899d7d557-qxdtt"]
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.036586 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.042898 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.054298 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5899d7d557-qxdtt"]
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065055 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eaa9048d-8c54-4054-87d1-69c6746c1479-ovn-rundir\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065129 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4rd\" (UniqueName: \"kubernetes.io/projected/eaa9048d-8c54-4054-87d1-69c6746c1479-kube-api-access-xc4rd\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065172 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa9048d-8c54-4054-87d1-69c6746c1479-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065212 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eaa9048d-8c54-4054-87d1-69c6746c1479-ovs-rundir\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065232 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa9048d-8c54-4054-87d1-69c6746c1479-combined-ca-bundle\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065254 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa9048d-8c54-4054-87d1-69c6746c1479-config\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065418 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eaa9048d-8c54-4054-87d1-69c6746c1479-ovn-rundir\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065868 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eaa9048d-8c54-4054-87d1-69c6746c1479-ovs-rundir\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.065985 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaa9048d-8c54-4054-87d1-69c6746c1479-config\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.081337 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa9048d-8c54-4054-87d1-69c6746c1479-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.084599 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.085675 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4rd\" (UniqueName: \"kubernetes.io/projected/eaa9048d-8c54-4054-87d1-69c6746c1479-kube-api-access-xc4rd\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.107586 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa9048d-8c54-4054-87d1-69c6746c1479-combined-ca-bundle\") pod \"ovn-controller-metrics-56vf8\" (UID: \"eaa9048d-8c54-4054-87d1-69c6746c1479\") " pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.169052 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-dns-svc\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.169138 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-config\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.169162 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smqm6\" (UniqueName: \"kubernetes.io/projected/c06e0509-685b-4010-9aef-1388bc28248d-kube-api-access-smqm6\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.169217 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-sb\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.169246 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-nb\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.182981 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-56vf8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.250891 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57467f675c-j7lcp"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.272342 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-dns-svc\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.272839 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-config\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.272885 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smqm6\" (UniqueName: \"kubernetes.io/projected/c06e0509-685b-4010-9aef-1388bc28248d-kube-api-access-smqm6\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.272930 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-sb\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.272960 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-nb\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.273959 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-nb\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.274532 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-config\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.275922 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-sb\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.276061 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-dns-svc\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.293252 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smqm6\" (UniqueName: \"kubernetes.io/projected/c06e0509-685b-4010-9aef-1388bc28248d-kube-api-access-smqm6\") pod \"dnsmasq-dns-5899d7d557-qxdtt\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.359984 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.371465 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.374005 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-dns-svc\") pod \"a7cb8c3d-2157-4c52-a196-24d514b098ee\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") "
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.374191 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-config\") pod \"a7cb8c3d-2157-4c52-a196-24d514b098ee\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") "
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.374340 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pkdd\" (UniqueName: \"kubernetes.io/projected/a7cb8c3d-2157-4c52-a196-24d514b098ee-kube-api-access-6pkdd\") pod \"a7cb8c3d-2157-4c52-a196-24d514b098ee\" (UID: \"a7cb8c3d-2157-4c52-a196-24d514b098ee\") "
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.386101 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cb8c3d-2157-4c52-a196-24d514b098ee-kube-api-access-6pkdd" (OuterVolumeSpecName: "kube-api-access-6pkdd") pod "a7cb8c3d-2157-4c52-a196-24d514b098ee" (UID: "a7cb8c3d-2157-4c52-a196-24d514b098ee"). InnerVolumeSpecName "kube-api-access-6pkdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.400480 4870 generic.go:334] "Generic (PLEG): container finished" podID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerID="a4110fcf56688524b599e752b9185dc87cb5234e1202ed239766375872ab4d4d" exitCode=0
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.400550 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" event={"ID":"033dbc66-0baa-46b3-8fda-3881303e4e40","Type":"ContainerDied","Data":"a4110fcf56688524b599e752b9185dc87cb5234e1202ed239766375872ab4d4d"}
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.402530 4870 generic.go:334] "Generic (PLEG): container finished" podID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerID="7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680" exitCode=0
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.403497 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57467f675c-j7lcp"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.403979 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" event={"ID":"a7cb8c3d-2157-4c52-a196-24d514b098ee","Type":"ContainerDied","Data":"7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680"}
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.404063 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57467f675c-j7lcp" event={"ID":"a7cb8c3d-2157-4c52-a196-24d514b098ee","Type":"ContainerDied","Data":"166a34a595f3784fb50b62bea8c6b3aecbe48d2fb8d288bfed3951f4b3b662b9"}
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.404160 4870 scope.go:117] "RemoveContainer" containerID="7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.439066 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a7cb8c3d-2157-4c52-a196-24d514b098ee" (UID: "a7cb8c3d-2157-4c52-a196-24d514b098ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.441295 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-config" (OuterVolumeSpecName: "config") pod "a7cb8c3d-2157-4c52-a196-24d514b098ee" (UID: "a7cb8c3d-2157-4c52-a196-24d514b098ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.445370 4870 scope.go:117] "RemoveContainer" containerID="f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.465832 4870 scope.go:117] "RemoveContainer" containerID="7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680"
Jan 30 08:26:39 crc kubenswrapper[4870]: E0130 08:26:39.469520 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680\": container with ID starting with 7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680 not found: ID does not exist" containerID="7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.469563 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680"} err="failed to get container status \"7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680\": rpc error: code = NotFound desc = could not find container \"7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680\": container with ID starting with 7126b51eddb9f7175bdeb0403c83717f365ff87bb0487d366795485ba9910680 not found: ID does not exist"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.469592 4870 scope.go:117] "RemoveContainer" containerID="f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc"
Jan 30 08:26:39 crc kubenswrapper[4870]: E0130 08:26:39.470110 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc\": container with ID starting with f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc not found: ID does not exist" containerID="f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.470141 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc"} err="failed to get container status \"f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc\": rpc error: code = NotFound desc = could not find container \"f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc\": container with ID starting with f169d9b6f52360d0a659533116b73054218a299b89ad6a74d7b9d17f3b0818bc not found: ID does not exist"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.477370 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pkdd\" (UniqueName: \"kubernetes.io/projected/a7cb8c3d-2157-4c52-a196-24d514b098ee-kube-api-access-6pkdd\") on node \"crc\" DevicePath \"\""
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.477395 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.477404 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7cb8c3d-2157-4c52-a196-24d514b098ee-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:26:39 crc kubenswrapper[4870]: W0130 08:26:39.650571 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d4c089_1da0_424e_9f99_008407498c84.slice/crio-b29df059ce6bb06a2e19ec34d7aab1136851e890c03a603fea3383c675972b5a WatchSource:0}: Error finding container b29df059ce6bb06a2e19ec34d7aab1136851e890c03a603fea3383c675972b5a: Status 404 returned error can't find the container with id b29df059ce6bb06a2e19ec34d7aab1136851e890c03a603fea3383c675972b5a
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.651777 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b46657c9-lg2gz"]
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.686844 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.747018 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57467f675c-j7lcp"]
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.752733 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.754639 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57467f675c-j7lcp"]
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.768330 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-56vf8"]
Jan 30 08:26:39 crc kubenswrapper[4870]: W0130 08:26:39.780721 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa9048d_8c54_4054_87d1_69c6746c1479.slice/crio-0f77ed04ed539efc5772e6e82c1708b77d0487083193039b6ffd7c3e093cad9f WatchSource:0}: Error finding container 0f77ed04ed539efc5772e6e82c1708b77d0487083193039b6ffd7c3e093cad9f: Status 404 returned error can't find the container with id 0f77ed04ed539efc5772e6e82c1708b77d0487083193039b6ffd7c3e093cad9f
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.792982 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gznh8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.793023 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gznh8"
Jan 30 08:26:39 crc kubenswrapper[4870]: I0130 08:26:39.883796 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5899d7d557-qxdtt"]
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.083260 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7cb8c3d-2157-4c52-a196-24d514b098ee" path="/var/lib/kubelet/pods/a7cb8c3d-2157-4c52-a196-24d514b098ee/volumes"
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.173264 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7"
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.299690 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-config\") pod \"033dbc66-0baa-46b3-8fda-3881303e4e40\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") "
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.299784 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd26p\" (UniqueName: \"kubernetes.io/projected/033dbc66-0baa-46b3-8fda-3881303e4e40-kube-api-access-zd26p\") pod \"033dbc66-0baa-46b3-8fda-3881303e4e40\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") "
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.299855 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-dns-svc\") pod \"033dbc66-0baa-46b3-8fda-3881303e4e40\" (UID: \"033dbc66-0baa-46b3-8fda-3881303e4e40\") "
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.305229 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033dbc66-0baa-46b3-8fda-3881303e4e40-kube-api-access-zd26p" (OuterVolumeSpecName: "kube-api-access-zd26p") pod "033dbc66-0baa-46b3-8fda-3881303e4e40" (UID: "033dbc66-0baa-46b3-8fda-3881303e4e40"). InnerVolumeSpecName "kube-api-access-zd26p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.371839 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-config" (OuterVolumeSpecName: "config") pod "033dbc66-0baa-46b3-8fda-3881303e4e40" (UID: "033dbc66-0baa-46b3-8fda-3881303e4e40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.376481 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "033dbc66-0baa-46b3-8fda-3881303e4e40" (UID: "033dbc66-0baa-46b3-8fda-3881303e4e40"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.401698 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.401727 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd26p\" (UniqueName: \"kubernetes.io/projected/033dbc66-0baa-46b3-8fda-3881303e4e40-kube-api-access-zd26p\") on node \"crc\" DevicePath \"\""
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.401738 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/033dbc66-0baa-46b3-8fda-3881303e4e40-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.412516 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" event={"ID":"c06e0509-685b-4010-9aef-1388bc28248d","Type":"ContainerStarted","Data":"88033e8d4c732a95cee343d71511e6c36bd6bfa8fe952134c2f0152ffc8ba5b1"}
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.413481 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-56vf8" event={"ID":"eaa9048d-8c54-4054-87d1-69c6746c1479","Type":"ContainerStarted","Data":"0f77ed04ed539efc5772e6e82c1708b77d0487083193039b6ffd7c3e093cad9f"}
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.414892 4870 generic.go:334] "Generic (PLEG): container finished" podID="a2d4c089-1da0-424e-9f99-008407498c84" containerID="59e1a6d7a27f28bf32baea3e14c93e926f69980f3fb15bd86a63262f4100c81d" exitCode=0
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.414958 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" event={"ID":"a2d4c089-1da0-424e-9f99-008407498c84","Type":"ContainerDied","Data":"59e1a6d7a27f28bf32baea3e14c93e926f69980f3fb15bd86a63262f4100c81d"}
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.414984 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" event={"ID":"a2d4c089-1da0-424e-9f99-008407498c84","Type":"ContainerStarted","Data":"b29df059ce6bb06a2e19ec34d7aab1136851e890c03a603fea3383c675972b5a"}
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.417603 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7"
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.420411 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" event={"ID":"033dbc66-0baa-46b3-8fda-3881303e4e40","Type":"ContainerDied","Data":"be7d1aa3f3670be6d94e4a7facc9d6deaa6f49ecc11c230b4a15c8f46a3117b6"}
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.420480 4870 scope.go:117] "RemoveContainer" containerID="a4110fcf56688524b599e752b9185dc87cb5234e1202ed239766375872ab4d4d"
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.424494 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.472283 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d56d856cf-n69v7"]
Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.483571 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d56d856cf-n69v7"]
Jan 30 08:26:40 crc kubenswrapper[4870]: 
I0130 08:26:40.748167 4870 scope.go:117] "RemoveContainer" containerID="fa2967a57d46eadef06dee8c0ac950d7fe80b8807314b92bceff366e74a2aaa8" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.780861 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.971833 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 30 08:26:40 crc kubenswrapper[4870]: E0130 08:26:40.972350 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerName="dnsmasq-dns" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.972368 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerName="dnsmasq-dns" Jan 30 08:26:40 crc kubenswrapper[4870]: E0130 08:26:40.972421 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerName="init" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.972428 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerName="init" Jan 30 08:26:40 crc kubenswrapper[4870]: E0130 08:26:40.972443 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerName="dnsmasq-dns" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.972451 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerName="dnsmasq-dns" Jan 30 08:26:40 crc kubenswrapper[4870]: E0130 08:26:40.972464 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerName="init" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.972471 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerName="init" Jan 30 08:26:40 
crc kubenswrapper[4870]: I0130 08:26:40.972665 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerName="dnsmasq-dns" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.972686 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cb8c3d-2157-4c52-a196-24d514b098ee" containerName="dnsmasq-dns" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.974063 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.983456 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.983494 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7z24b" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.983760 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.983904 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 30 08:26:40 crc kubenswrapper[4870]: I0130 08:26:40.996618 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.123184 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.136952 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.137004 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d69aef12-ac48-41f7-8a14-a561edab0ae7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.137047 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.137085 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsgpx\" (UniqueName: \"kubernetes.io/projected/d69aef12-ac48-41f7-8a14-a561edab0ae7-kube-api-access-bsgpx\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.137120 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69aef12-ac48-41f7-8a14-a561edab0ae7-config\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.137151 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.137179 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/d69aef12-ac48-41f7-8a14-a561edab0ae7-scripts\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.202851 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b46657c9-lg2gz"] Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.236306 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65cc6fcf45-r6b9m"] Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.237597 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.238211 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.238243 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d69aef12-ac48-41f7-8a14-a561edab0ae7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.238279 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.238331 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsgpx\" (UniqueName: 
\"kubernetes.io/projected/d69aef12-ac48-41f7-8a14-a561edab0ae7-kube-api-access-bsgpx\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.238478 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69aef12-ac48-41f7-8a14-a561edab0ae7-config\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.238790 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d69aef12-ac48-41f7-8a14-a561edab0ae7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.239184 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.239296 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69aef12-ac48-41f7-8a14-a561edab0ae7-scripts\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.239573 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69aef12-ac48-41f7-8a14-a561edab0ae7-config\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.239967 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69aef12-ac48-41f7-8a14-a561edab0ae7-scripts\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.244035 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.245215 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.254500 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69aef12-ac48-41f7-8a14-a561edab0ae7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.257836 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cc6fcf45-r6b9m"] Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.271824 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsgpx\" (UniqueName: \"kubernetes.io/projected/d69aef12-ac48-41f7-8a14-a561edab0ae7-kube-api-access-bsgpx\") pod \"ovn-northd-0\" (UID: \"d69aef12-ac48-41f7-8a14-a561edab0ae7\") " pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.333218 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.340642 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-config\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.340733 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8qz\" (UniqueName: \"kubernetes.io/projected/8cd19c31-4252-4de7-a673-9da7aedcb785-kube-api-access-cx8qz\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.340786 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-nb\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.340820 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-dns-svc\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.340835 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-sb\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: 
\"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.441924 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-nb\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.442194 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-dns-svc\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.442217 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-sb\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.442290 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-config\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.442338 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8qz\" (UniqueName: \"kubernetes.io/projected/8cd19c31-4252-4de7-a673-9da7aedcb785-kube-api-access-cx8qz\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " 
pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.443023 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-nb\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.443295 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-dns-svc\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.447978 4870 generic.go:334] "Generic (PLEG): container finished" podID="c06e0509-685b-4010-9aef-1388bc28248d" containerID="38cec7542121e32b4b05300244ea351b4cc373c4ddccf22126a36f614eae49e4" exitCode=0 Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.448031 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" event={"ID":"c06e0509-685b-4010-9aef-1388bc28248d","Type":"ContainerDied","Data":"38cec7542121e32b4b05300244ea351b4cc373c4ddccf22126a36f614eae49e4"} Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.451210 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-sb\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.458134 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-56vf8" 
event={"ID":"eaa9048d-8c54-4054-87d1-69c6746c1479","Type":"ContainerStarted","Data":"e05e8c7168dd6280ee42c06df96bc58337340363872a03081ffb828299776621"} Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.465185 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-config\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.467482 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8qz\" (UniqueName: \"kubernetes.io/projected/8cd19c31-4252-4de7-a673-9da7aedcb785-kube-api-access-cx8qz\") pod \"dnsmasq-dns-65cc6fcf45-r6b9m\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.467529 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" event={"ID":"a2d4c089-1da0-424e-9f99-008407498c84","Type":"ContainerStarted","Data":"08a90b3a0b430b6fe52c0e915f7af8b01113438944cfd79325217422f6e2a114"} Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.467560 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.501743 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-56vf8" podStartSLOduration=3.501704036 podStartE2EDuration="3.501704036s" podCreationTimestamp="2026-01-30 08:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:26:41.492551174 +0000 UTC m=+1040.188098283" watchObservedRunningTime="2026-01-30 08:26:41.501704036 +0000 UTC m=+1040.197251145" Jan 30 08:26:41 crc 
kubenswrapper[4870]: I0130 08:26:41.520744 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" podStartSLOduration=3.520726629 podStartE2EDuration="3.520726629s" podCreationTimestamp="2026-01-30 08:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:26:41.52009674 +0000 UTC m=+1040.215643849" watchObservedRunningTime="2026-01-30 08:26:41.520726629 +0000 UTC m=+1040.216273738" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.619283 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:41 crc kubenswrapper[4870]: E0130 08:26:41.668707 4870 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 30 08:26:41 crc kubenswrapper[4870]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c06e0509-685b-4010-9aef-1388bc28248d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 08:26:41 crc kubenswrapper[4870]: > podSandboxID="88033e8d4c732a95cee343d71511e6c36bd6bfa8fe952134c2f0152ffc8ba5b1" Jan 30 08:26:41 crc kubenswrapper[4870]: E0130 08:26:41.669234 4870 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 08:26:41 crc kubenswrapper[4870]: container &Container{Name:dnsmasq-dns,Image:38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7bh64fh67ch5c4h65bh587h67fh546h7bhc4h688h596h5c7h554h99h8h5dch586h7h5cbh686h55h64bh7dhdbhb6h575h65ch654h658h688h65bq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-smqm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5899d7d557-qxdtt_openstack(c06e0509-685b-4010-9aef-1388bc28248d): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c06e0509-685b-4010-9aef-1388bc28248d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 08:26:41 crc kubenswrapper[4870]: > logger="UnhandledError" Jan 30 08:26:41 crc kubenswrapper[4870]: E0130 08:26:41.670416 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c06e0509-685b-4010-9aef-1388bc28248d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" podUID="c06e0509-685b-4010-9aef-1388bc28248d" Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.817686 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 08:26:41 crc kubenswrapper[4870]: W0130 08:26:41.825865 4870 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd69aef12_ac48_41f7_8a14_a561edab0ae7.slice/crio-97e7f5b8cfce6c8c68f3754b85f177c27702b92922b9f413ad7d73de0c8a4624 WatchSource:0}: Error finding container 97e7f5b8cfce6c8c68f3754b85f177c27702b92922b9f413ad7d73de0c8a4624: Status 404 returned error can't find the container with id 97e7f5b8cfce6c8c68f3754b85f177c27702b92922b9f413ad7d73de0c8a4624 Jan 30 08:26:41 crc kubenswrapper[4870]: I0130 08:26:41.910844 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cc6fcf45-r6b9m"] Jan 30 08:26:41 crc kubenswrapper[4870]: W0130 08:26:41.948120 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd19c31_4252_4de7_a673_9da7aedcb785.slice/crio-374b146ad8265eb6041ff5f1143dd86432961e72a020212acd84189e8d8f2978 WatchSource:0}: Error finding container 374b146ad8265eb6041ff5f1143dd86432961e72a020212acd84189e8d8f2978: Status 404 returned error can't find the container with id 374b146ad8265eb6041ff5f1143dd86432961e72a020212acd84189e8d8f2978 Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.094018 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" path="/var/lib/kubelet/pods/033dbc66-0baa-46b3-8fda-3881303e4e40/volumes" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.350475 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.355668 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.361775 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.361792 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.362021 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.364071 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jpqmh" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.376544 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.463411 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxkx\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-kube-api-access-zrxkx\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.463547 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.463592 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/46634e41-7d5b-4181-b824-716bb37fca47-cache\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 
08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.463625 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.463706 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46634e41-7d5b-4181-b824-716bb37fca47-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.463739 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/46634e41-7d5b-4181-b824-716bb37fca47-lock\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.475634 4870 generic.go:334] "Generic (PLEG): container finished" podID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerID="431cfd92f94e1fd13cdf200e4b8c59047ac3e311acf24702741a42c672002d0e" exitCode=0 Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.475683 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerDied","Data":"431cfd92f94e1fd13cdf200e4b8c59047ac3e311acf24702741a42c672002d0e"} Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.482094 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d69aef12-ac48-41f7-8a14-a561edab0ae7","Type":"ContainerStarted","Data":"97e7f5b8cfce6c8c68f3754b85f177c27702b92922b9f413ad7d73de0c8a4624"} Jan 30 08:26:42 crc 
kubenswrapper[4870]: I0130 08:26:42.485037 4870 generic.go:334] "Generic (PLEG): container finished" podID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerID="fa0e8e29630ac45ae5392bdda60293a38298eb7a8fb05baa4e216154fe19f932" exitCode=0 Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.485153 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" event={"ID":"8cd19c31-4252-4de7-a673-9da7aedcb785","Type":"ContainerDied","Data":"fa0e8e29630ac45ae5392bdda60293a38298eb7a8fb05baa4e216154fe19f932"} Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.485234 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" event={"ID":"8cd19c31-4252-4de7-a673-9da7aedcb785","Type":"ContainerStarted","Data":"374b146ad8265eb6041ff5f1143dd86432961e72a020212acd84189e8d8f2978"} Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.485965 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" podUID="a2d4c089-1da0-424e-9f99-008407498c84" containerName="dnsmasq-dns" containerID="cri-o://08a90b3a0b430b6fe52c0e915f7af8b01113438944cfd79325217422f6e2a114" gracePeriod=10 Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.565964 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46634e41-7d5b-4181-b824-716bb37fca47-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.566016 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/46634e41-7d5b-4181-b824-716bb37fca47-lock\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.566149 
4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxkx\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-kube-api-access-zrxkx\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.566216 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.566280 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/46634e41-7d5b-4181-b824-716bb37fca47-cache\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.566320 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.568183 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: E0130 08:26:42.570426 4870 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 08:26:42 crc kubenswrapper[4870]: E0130 08:26:42.570442 4870 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 08:26:42 crc kubenswrapper[4870]: E0130 08:26:42.570493 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift podName:46634e41-7d5b-4181-b824-716bb37fca47 nodeName:}" failed. No retries permitted until 2026-01-30 08:26:43.070467153 +0000 UTC m=+1041.766014262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift") pod "swift-storage-0" (UID: "46634e41-7d5b-4181-b824-716bb37fca47") : configmap "swift-ring-files" not found Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.572495 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/46634e41-7d5b-4181-b824-716bb37fca47-lock\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.572743 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/46634e41-7d5b-4181-b824-716bb37fca47-cache\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.580662 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46634e41-7d5b-4181-b824-716bb37fca47-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.588364 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxkx\" (UniqueName: 
\"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-kube-api-access-zrxkx\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.589565 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.875050 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dlnvg"] Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.876309 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.882612 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.882643 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.884055 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.914742 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dlnvg"] Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.921092 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-dlnvg"] Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.926934 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gkrl7"] Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.928039 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc kubenswrapper[4870]: E0130 08:26:42.930055 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-7qfqw ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-7qfqw ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-dlnvg" podUID="8ccf52cf-97d4-4b27-8305-24222e79cc73" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.947651 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gkrl7"] Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.973989 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/8ccf52cf-97d4-4b27-8305-24222e79cc73-kube-api-access-7qfqw\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974049 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ccf52cf-97d4-4b27-8305-24222e79cc73-etc-swift\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974120 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-combined-ca-bundle\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:42 crc kubenswrapper[4870]: 
I0130 08:26:42.974145 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-scripts\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974184 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-scripts\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974213 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-dispersionconf\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974303 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-combined-ca-bundle\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974431 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-dispersionconf\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 
08:26:42.974472 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-ring-data-devices\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974610 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-swiftconf\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974646 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-ring-data-devices\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974694 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4406e732-41a8-48a1-954a-6dbe4483a79a-etc-swift\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc kubenswrapper[4870]: I0130 08:26:42.974776 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6z9k\" (UniqueName: \"kubernetes.io/projected/4406e732-41a8-48a1-954a-6dbe4483a79a-kube-api-access-d6z9k\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:42 crc 
kubenswrapper[4870]: I0130 08:26:42.976094 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-swiftconf\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.077864 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-dispersionconf\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.077938 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-ring-data-devices\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078006 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-swiftconf\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078033 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-ring-data-devices\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078066 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4406e732-41a8-48a1-954a-6dbe4483a79a-etc-swift\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078094 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6z9k\" (UniqueName: \"kubernetes.io/projected/4406e732-41a8-48a1-954a-6dbe4483a79a-kube-api-access-d6z9k\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078137 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078165 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-swiftconf\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078191 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/8ccf52cf-97d4-4b27-8305-24222e79cc73-kube-api-access-7qfqw\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078214 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/8ccf52cf-97d4-4b27-8305-24222e79cc73-etc-swift\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078236 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-combined-ca-bundle\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078259 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-scripts\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078311 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-scripts\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078363 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-dispersionconf\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.078394 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-combined-ca-bundle\") pod 
\"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: E0130 08:26:43.078517 4870 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 08:26:43 crc kubenswrapper[4870]: E0130 08:26:43.078550 4870 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 08:26:43 crc kubenswrapper[4870]: E0130 08:26:43.078624 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift podName:46634e41-7d5b-4181-b824-716bb37fca47 nodeName:}" failed. No retries permitted until 2026-01-30 08:26:44.078601826 +0000 UTC m=+1042.774148945 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift") pod "swift-storage-0" (UID: "46634e41-7d5b-4181-b824-716bb37fca47") : configmap "swift-ring-files" not found Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.079703 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4406e732-41a8-48a1-954a-6dbe4483a79a-etc-swift\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.079968 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-ring-data-devices\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.080653 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ccf52cf-97d4-4b27-8305-24222e79cc73-etc-swift\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.081712 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-dispersionconf\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.082791 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-scripts\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.083725 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-combined-ca-bundle\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.083792 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-combined-ca-bundle\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.084417 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-scripts\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.084490 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-ring-data-devices\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.085486 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-swiftconf\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.085553 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-swiftconf\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.089262 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-dispersionconf\") pod \"swift-ring-rebalance-dlnvg\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.107066 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/8ccf52cf-97d4-4b27-8305-24222e79cc73-kube-api-access-7qfqw\") pod \"swift-ring-rebalance-dlnvg\" (UID: 
\"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.111177 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6z9k\" (UniqueName: \"kubernetes.io/projected/4406e732-41a8-48a1-954a-6dbe4483a79a-kube-api-access-d6z9k\") pod \"swift-ring-rebalance-gkrl7\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") " pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.247601 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gkrl7" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.495724 4870 generic.go:334] "Generic (PLEG): container finished" podID="a2d4c089-1da0-424e-9f99-008407498c84" containerID="08a90b3a0b430b6fe52c0e915f7af8b01113438944cfd79325217422f6e2a114" exitCode=0 Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.496180 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.495925 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" event={"ID":"a2d4c089-1da0-424e-9f99-008407498c84","Type":"ContainerDied","Data":"08a90b3a0b430b6fe52c0e915f7af8b01113438944cfd79325217422f6e2a114"} Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.506628 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.584767 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-combined-ca-bundle\") pod \"8ccf52cf-97d4-4b27-8305-24222e79cc73\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.584818 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-ring-data-devices\") pod \"8ccf52cf-97d4-4b27-8305-24222e79cc73\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.584839 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-dispersionconf\") pod \"8ccf52cf-97d4-4b27-8305-24222e79cc73\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.584867 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-scripts\") pod \"8ccf52cf-97d4-4b27-8305-24222e79cc73\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.585010 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ccf52cf-97d4-4b27-8305-24222e79cc73-etc-swift\") pod \"8ccf52cf-97d4-4b27-8305-24222e79cc73\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.585064 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-swiftconf\") pod \"8ccf52cf-97d4-4b27-8305-24222e79cc73\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.585084 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/8ccf52cf-97d4-4b27-8305-24222e79cc73-kube-api-access-7qfqw\") pod \"8ccf52cf-97d4-4b27-8305-24222e79cc73\" (UID: \"8ccf52cf-97d4-4b27-8305-24222e79cc73\") " Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.585405 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8ccf52cf-97d4-4b27-8305-24222e79cc73" (UID: "8ccf52cf-97d4-4b27-8305-24222e79cc73"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.585751 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ccf52cf-97d4-4b27-8305-24222e79cc73-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8ccf52cf-97d4-4b27-8305-24222e79cc73" (UID: "8ccf52cf-97d4-4b27-8305-24222e79cc73"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.585864 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-scripts" (OuterVolumeSpecName: "scripts") pod "8ccf52cf-97d4-4b27-8305-24222e79cc73" (UID: "8ccf52cf-97d4-4b27-8305-24222e79cc73"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.589344 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8ccf52cf-97d4-4b27-8305-24222e79cc73" (UID: "8ccf52cf-97d4-4b27-8305-24222e79cc73"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.589759 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ccf52cf-97d4-4b27-8305-24222e79cc73" (UID: "8ccf52cf-97d4-4b27-8305-24222e79cc73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.589958 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ccf52cf-97d4-4b27-8305-24222e79cc73-kube-api-access-7qfqw" (OuterVolumeSpecName: "kube-api-access-7qfqw") pod "8ccf52cf-97d4-4b27-8305-24222e79cc73" (UID: "8ccf52cf-97d4-4b27-8305-24222e79cc73"). InnerVolumeSpecName "kube-api-access-7qfqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.590378 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8ccf52cf-97d4-4b27-8305-24222e79cc73" (UID: "8ccf52cf-97d4-4b27-8305-24222e79cc73"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.687456 4870 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ccf52cf-97d4-4b27-8305-24222e79cc73-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.687821 4870 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.687838 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/8ccf52cf-97d4-4b27-8305-24222e79cc73-kube-api-access-7qfqw\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.687858 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.687900 4870 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.687916 4870 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ccf52cf-97d4-4b27-8305-24222e79cc73-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.687941 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf52cf-97d4-4b27-8305-24222e79cc73-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:43 crc kubenswrapper[4870]: I0130 08:26:43.787586 4870 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gkrl7"] Jan 30 08:26:43 crc kubenswrapper[4870]: W0130 08:26:43.983731 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4406e732_41a8_48a1_954a_6dbe4483a79a.slice/crio-c2c9f5c52e6901456bac3bbffbdb6a069ed3e92d4e7f13f97a8b660b6688ff9d WatchSource:0}: Error finding container c2c9f5c52e6901456bac3bbffbdb6a069ed3e92d4e7f13f97a8b660b6688ff9d: Status 404 returned error can't find the container with id c2c9f5c52e6901456bac3bbffbdb6a069ed3e92d4e7f13f97a8b660b6688ff9d Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.096951 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:44 crc kubenswrapper[4870]: E0130 08:26:44.097115 4870 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 08:26:44 crc kubenswrapper[4870]: E0130 08:26:44.097127 4870 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 08:26:44 crc kubenswrapper[4870]: E0130 08:26:44.097169 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift podName:46634e41-7d5b-4181-b824-716bb37fca47 nodeName:}" failed. No retries permitted until 2026-01-30 08:26:46.097155527 +0000 UTC m=+1044.792702636 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift") pod "swift-storage-0" (UID: "46634e41-7d5b-4181-b824-716bb37fca47") : configmap "swift-ring-files" not found Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.153159 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.198127 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz569\" (UniqueName: \"kubernetes.io/projected/a2d4c089-1da0-424e-9f99-008407498c84-kube-api-access-sz569\") pod \"a2d4c089-1da0-424e-9f99-008407498c84\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.198415 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-dns-svc\") pod \"a2d4c089-1da0-424e-9f99-008407498c84\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.198484 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-config\") pod \"a2d4c089-1da0-424e-9f99-008407498c84\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.198583 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-ovsdbserver-nb\") pod \"a2d4c089-1da0-424e-9f99-008407498c84\" (UID: \"a2d4c089-1da0-424e-9f99-008407498c84\") " Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.203187 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a2d4c089-1da0-424e-9f99-008407498c84-kube-api-access-sz569" (OuterVolumeSpecName: "kube-api-access-sz569") pod "a2d4c089-1da0-424e-9f99-008407498c84" (UID: "a2d4c089-1da0-424e-9f99-008407498c84"). InnerVolumeSpecName "kube-api-access-sz569". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.245061 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2d4c089-1da0-424e-9f99-008407498c84" (UID: "a2d4c089-1da0-424e-9f99-008407498c84"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.250431 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-config" (OuterVolumeSpecName: "config") pod "a2d4c089-1da0-424e-9f99-008407498c84" (UID: "a2d4c089-1da0-424e-9f99-008407498c84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.254611 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2d4c089-1da0-424e-9f99-008407498c84" (UID: "a2d4c089-1da0-424e-9f99-008407498c84"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.300580 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz569\" (UniqueName: \"kubernetes.io/projected/a2d4c089-1da0-424e-9f99-008407498c84-kube-api-access-sz569\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.300611 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.300620 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.300629 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d4c089-1da0-424e-9f99-008407498c84-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.506523 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d69aef12-ac48-41f7-8a14-a561edab0ae7","Type":"ContainerStarted","Data":"91aa8b7933eda498160fe720c4f474e676fcad8053d1ffc08ce9de0eddca82b8"} Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.506564 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d69aef12-ac48-41f7-8a14-a561edab0ae7","Type":"ContainerStarted","Data":"e50e54344c8e8b8c4465f05fda6f13d5222f00a1171727f117c5308aa992b441"} Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.506693 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.511691 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" event={"ID":"c06e0509-685b-4010-9aef-1388bc28248d","Type":"ContainerStarted","Data":"30dfb255982a92013d72993cf922c1d38121fed24accaafc65f742335785a2ce"} Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.512522 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.515348 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" event={"ID":"8cd19c31-4252-4de7-a673-9da7aedcb785","Type":"ContainerStarted","Data":"137ef8da742a762887455130866543407aab4e626fc693e72bbf0ba327725c4f"} Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.517473 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.519617 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" event={"ID":"a2d4c089-1da0-424e-9f99-008407498c84","Type":"ContainerDied","Data":"b29df059ce6bb06a2e19ec34d7aab1136851e890c03a603fea3383c675972b5a"} Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.519674 4870 scope.go:117] "RemoveContainer" containerID="08a90b3a0b430b6fe52c0e915f7af8b01113438944cfd79325217422f6e2a114" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.519787 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b46657c9-lg2gz" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.527346 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dlnvg" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.527358 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkrl7" event={"ID":"4406e732-41a8-48a1-954a-6dbe4483a79a","Type":"ContainerStarted","Data":"c2c9f5c52e6901456bac3bbffbdb6a069ed3e92d4e7f13f97a8b660b6688ff9d"} Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.538908 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.33422093 podStartE2EDuration="4.538888023s" podCreationTimestamp="2026-01-30 08:26:40 +0000 UTC" firstStartedPulling="2026-01-30 08:26:41.829444788 +0000 UTC m=+1040.524991897" lastFinishedPulling="2026-01-30 08:26:44.034111881 +0000 UTC m=+1042.729658990" observedRunningTime="2026-01-30 08:26:44.525159307 +0000 UTC m=+1043.220706416" watchObservedRunningTime="2026-01-30 08:26:44.538888023 +0000 UTC m=+1043.234435132" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.545207 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" podStartSLOduration=5.5451912100000005 podStartE2EDuration="5.54519121s" podCreationTimestamp="2026-01-30 08:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:26:44.542973795 +0000 UTC m=+1043.238520924" watchObservedRunningTime="2026-01-30 08:26:44.54519121 +0000 UTC m=+1043.240738319" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.565562 4870 scope.go:117] "RemoveContainer" containerID="59e1a6d7a27f28bf32baea3e14c93e926f69980f3fb15bd86a63262f4100c81d" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.573689 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" podStartSLOduration=3.573659923 podStartE2EDuration="3.573659923s" 
podCreationTimestamp="2026-01-30 08:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:26:44.572437977 +0000 UTC m=+1043.267985126" watchObservedRunningTime="2026-01-30 08:26:44.573659923 +0000 UTC m=+1043.269207032" Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.635571 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-dlnvg"] Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.642441 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-dlnvg"] Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.649987 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b46657c9-lg2gz"] Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.657065 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b46657c9-lg2gz"] Jan 30 08:26:44 crc kubenswrapper[4870]: I0130 08:26:44.768442 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d56d856cf-n69v7" podUID="033dbc66-0baa-46b3-8fda-3881303e4e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.103:5353: i/o timeout" Jan 30 08:26:46 crc kubenswrapper[4870]: I0130 08:26:46.087001 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ccf52cf-97d4-4b27-8305-24222e79cc73" path="/var/lib/kubelet/pods/8ccf52cf-97d4-4b27-8305-24222e79cc73/volumes" Jan 30 08:26:46 crc kubenswrapper[4870]: I0130 08:26:46.088733 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d4c089-1da0-424e-9f99-008407498c84" path="/var/lib/kubelet/pods/a2d4c089-1da0-424e-9f99-008407498c84/volumes" Jan 30 08:26:46 crc kubenswrapper[4870]: I0130 08:26:46.150571 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:46 crc kubenswrapper[4870]: E0130 08:26:46.150727 4870 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 08:26:46 crc kubenswrapper[4870]: E0130 08:26:46.150741 4870 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 08:26:46 crc kubenswrapper[4870]: E0130 08:26:46.150786 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift podName:46634e41-7d5b-4181-b824-716bb37fca47 nodeName:}" failed. No retries permitted until 2026-01-30 08:26:50.150773789 +0000 UTC m=+1048.846320898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift") pod "swift-storage-0" (UID: "46634e41-7d5b-4181-b824-716bb37fca47") : configmap "swift-ring-files" not found Jan 30 08:26:47 crc kubenswrapper[4870]: I0130 08:26:47.553486 4870 generic.go:334] "Generic (PLEG): container finished" podID="31607550-5ccc-4b0b-9fbd-18007a61dcff" containerID="0a51df6c2a8c835be83788e6c0e9cc99339b4ef2dc9fce8dc1f0609f8b094b25" exitCode=0 Jan 30 08:26:47 crc kubenswrapper[4870]: I0130 08:26:47.553595 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31607550-5ccc-4b0b-9fbd-18007a61dcff","Type":"ContainerDied","Data":"0a51df6c2a8c835be83788e6c0e9cc99339b4ef2dc9fce8dc1f0609f8b094b25"} Jan 30 08:26:47 crc kubenswrapper[4870]: I0130 08:26:47.555907 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkrl7" 
event={"ID":"4406e732-41a8-48a1-954a-6dbe4483a79a","Type":"ContainerStarted","Data":"eda5abff9e7bbd3cf114a7856edb58fc9717ac9b1210df0aa05845e17e8d856e"} Jan 30 08:26:47 crc kubenswrapper[4870]: I0130 08:26:47.564349 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a" containerID="36c6b3f0e330f4c5764ceb8dd30c047b28952964612889bae6f27160bd91c81f" exitCode=0 Jan 30 08:26:47 crc kubenswrapper[4870]: I0130 08:26:47.564420 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a","Type":"ContainerDied","Data":"36c6b3f0e330f4c5764ceb8dd30c047b28952964612889bae6f27160bd91c81f"} Jan 30 08:26:47 crc kubenswrapper[4870]: I0130 08:26:47.623082 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-gkrl7" podStartSLOduration=3.11777077 podStartE2EDuration="5.623065272s" podCreationTimestamp="2026-01-30 08:26:42 +0000 UTC" firstStartedPulling="2026-01-30 08:26:43.98509933 +0000 UTC m=+1042.680646439" lastFinishedPulling="2026-01-30 08:26:46.490393832 +0000 UTC m=+1045.185940941" observedRunningTime="2026-01-30 08:26:47.621487846 +0000 UTC m=+1046.317034955" watchObservedRunningTime="2026-01-30 08:26:47.623065272 +0000 UTC m=+1046.318612391" Jan 30 08:26:49 crc kubenswrapper[4870]: I0130 08:26:49.373159 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" Jan 30 08:26:50 crc kubenswrapper[4870]: I0130 08:26:50.232058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:26:50 crc kubenswrapper[4870]: E0130 08:26:50.232393 4870 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 08:26:50 crc kubenswrapper[4870]: E0130 08:26:50.232434 4870 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 08:26:50 crc kubenswrapper[4870]: E0130 08:26:50.232532 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift podName:46634e41-7d5b-4181-b824-716bb37fca47 nodeName:}" failed. No retries permitted until 2026-01-30 08:26:58.232505038 +0000 UTC m=+1056.928052187 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift") pod "swift-storage-0" (UID: "46634e41-7d5b-4181-b824-716bb37fca47") : configmap "swift-ring-files" not found Jan 30 08:26:51 crc kubenswrapper[4870]: I0130 08:26:51.621336 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:26:51 crc kubenswrapper[4870]: I0130 08:26:51.720076 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5899d7d557-qxdtt"] Jan 30 08:26:51 crc kubenswrapper[4870]: I0130 08:26:51.720331 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" podUID="c06e0509-685b-4010-9aef-1388bc28248d" containerName="dnsmasq-dns" containerID="cri-o://30dfb255982a92013d72993cf922c1d38121fed24accaafc65f742335785a2ce" gracePeriod=10 Jan 30 08:26:54 crc kubenswrapper[4870]: I0130 08:26:54.372073 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" podUID="c06e0509-685b-4010-9aef-1388bc28248d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Jan 30 08:26:54 crc kubenswrapper[4870]: I0130 08:26:54.643906 
4870 generic.go:334] "Generic (PLEG): container finished" podID="c06e0509-685b-4010-9aef-1388bc28248d" containerID="30dfb255982a92013d72993cf922c1d38121fed24accaafc65f742335785a2ce" exitCode=0 Jan 30 08:26:54 crc kubenswrapper[4870]: I0130 08:26:54.643930 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" event={"ID":"c06e0509-685b-4010-9aef-1388bc28248d","Type":"ContainerDied","Data":"30dfb255982a92013d72993cf922c1d38121fed24accaafc65f742335785a2ce"} Jan 30 08:26:54 crc kubenswrapper[4870]: I0130 08:26:54.995415 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.025975 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smqm6\" (UniqueName: \"kubernetes.io/projected/c06e0509-685b-4010-9aef-1388bc28248d-kube-api-access-smqm6\") pod \"c06e0509-685b-4010-9aef-1388bc28248d\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.026112 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-sb\") pod \"c06e0509-685b-4010-9aef-1388bc28248d\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.026279 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-nb\") pod \"c06e0509-685b-4010-9aef-1388bc28248d\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.027763 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-config\") pod \"c06e0509-685b-4010-9aef-1388bc28248d\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.028491 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-dns-svc\") pod \"c06e0509-685b-4010-9aef-1388bc28248d\" (UID: \"c06e0509-685b-4010-9aef-1388bc28248d\") " Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.034099 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06e0509-685b-4010-9aef-1388bc28248d-kube-api-access-smqm6" (OuterVolumeSpecName: "kube-api-access-smqm6") pod "c06e0509-685b-4010-9aef-1388bc28248d" (UID: "c06e0509-685b-4010-9aef-1388bc28248d"). InnerVolumeSpecName "kube-api-access-smqm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.077056 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c06e0509-685b-4010-9aef-1388bc28248d" (UID: "c06e0509-685b-4010-9aef-1388bc28248d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.083377 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c06e0509-685b-4010-9aef-1388bc28248d" (UID: "c06e0509-685b-4010-9aef-1388bc28248d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.087275 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-config" (OuterVolumeSpecName: "config") pod "c06e0509-685b-4010-9aef-1388bc28248d" (UID: "c06e0509-685b-4010-9aef-1388bc28248d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.096231 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c06e0509-685b-4010-9aef-1388bc28248d" (UID: "c06e0509-685b-4010-9aef-1388bc28248d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.132338 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.132375 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.132385 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smqm6\" (UniqueName: \"kubernetes.io/projected/c06e0509-685b-4010-9aef-1388bc28248d-kube-api-access-smqm6\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.132396 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:26:55 crc 
kubenswrapper[4870]: I0130 08:26:55.132406 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c06e0509-685b-4010-9aef-1388bc28248d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.657334 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt" event={"ID":"c06e0509-685b-4010-9aef-1388bc28248d","Type":"ContainerDied","Data":"88033e8d4c732a95cee343d71511e6c36bd6bfa8fe952134c2f0152ffc8ba5b1"}
Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.657413 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5899d7d557-qxdtt"
Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.657417 4870 scope.go:117] "RemoveContainer" containerID="30dfb255982a92013d72993cf922c1d38121fed24accaafc65f742335785a2ce"
Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.702809 4870 scope.go:117] "RemoveContainer" containerID="38cec7542121e32b4b05300244ea351b4cc373c4ddccf22126a36f614eae49e4"
Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.711902 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5899d7d557-qxdtt"]
Jan 30 08:26:55 crc kubenswrapper[4870]: I0130 08:26:55.721193 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5899d7d557-qxdtt"]
Jan 30 08:26:56 crc kubenswrapper[4870]: I0130 08:26:56.091626 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06e0509-685b-4010-9aef-1388bc28248d" path="/var/lib/kubelet/pods/c06e0509-685b-4010-9aef-1388bc28248d/volumes"
Jan 30 08:26:56 crc kubenswrapper[4870]: I0130 08:26:56.673932 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a","Type":"ContainerStarted","Data":"588fbb7f8fee6acd1e8c15f77c43a3c289e7cd1a0f7662ff0f25b69677dcd399"}
Jan 30 08:26:57 crc kubenswrapper[4870]: I0130 08:26:57.692371 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31607550-5ccc-4b0b-9fbd-18007a61dcff","Type":"ContainerStarted","Data":"e70d386e14055e09ce722da0b01bbdd85f97f4ce6f3a9e4ffe4a30dc7296472a"}
Jan 30 08:26:57 crc kubenswrapper[4870]: I0130 08:26:57.705403 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerStarted","Data":"4c9a2cc96afb4697dcd9efa47cf237f34c2e2a0fb97e86a04bed4e71098a047b"}
Jan 30 08:26:57 crc kubenswrapper[4870]: I0130 08:26:57.731872 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=40.771462729 podStartE2EDuration="50.73184977s" podCreationTimestamp="2026-01-30 08:26:07 +0000 UTC" firstStartedPulling="2026-01-30 08:26:20.681624907 +0000 UTC m=+1019.377172016" lastFinishedPulling="2026-01-30 08:26:30.642011938 +0000 UTC m=+1029.337559057" observedRunningTime="2026-01-30 08:26:57.72951205 +0000 UTC m=+1056.425059199" watchObservedRunningTime="2026-01-30 08:26:57.73184977 +0000 UTC m=+1056.427396919"
Jan 30 08:26:57 crc kubenswrapper[4870]: I0130 08:26:57.767839 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=41.302867335 podStartE2EDuration="51.767807785s" podCreationTimestamp="2026-01-30 08:26:06 +0000 UTC" firstStartedPulling="2026-01-30 08:26:20.670191122 +0000 UTC m=+1019.365738221" lastFinishedPulling="2026-01-30 08:26:31.135131562 +0000 UTC m=+1029.830678671" observedRunningTime="2026-01-30 08:26:57.763078624 +0000 UTC m=+1056.458625803" watchObservedRunningTime="2026-01-30 08:26:57.767807785 +0000 UTC m=+1056.463354934"
Jan 30 08:26:58 crc kubenswrapper[4870]: I0130 08:26:58.300930 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0"
Jan 30 08:26:58 crc kubenswrapper[4870]: E0130 08:26:58.301211 4870 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 30 08:26:58 crc kubenswrapper[4870]: E0130 08:26:58.301236 4870 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 30 08:26:58 crc kubenswrapper[4870]: E0130 08:26:58.301306 4870 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift podName:46634e41-7d5b-4181-b824-716bb37fca47 nodeName:}" failed. No retries permitted until 2026-01-30 08:27:14.301280746 +0000 UTC m=+1072.996827895 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift") pod "swift-storage-0" (UID: "46634e41-7d5b-4181-b824-716bb37fca47") : configmap "swift-ring-files" not found
Jan 30 08:26:59 crc kubenswrapper[4870]: I0130 08:26:59.111733 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Jan 30 08:26:59 crc kubenswrapper[4870]: I0130 08:26:59.112537 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Jan 30 08:27:00 crc kubenswrapper[4870]: I0130 08:27:00.735325 4870 generic.go:334] "Generic (PLEG): container finished" podID="4406e732-41a8-48a1-954a-6dbe4483a79a" containerID="eda5abff9e7bbd3cf114a7856edb58fc9717ac9b1210df0aa05845e17e8d856e" exitCode=0
Jan 30 08:27:00 crc kubenswrapper[4870]: I0130 08:27:00.735388 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkrl7" event={"ID":"4406e732-41a8-48a1-954a-6dbe4483a79a","Type":"ContainerDied","Data":"eda5abff9e7bbd3cf114a7856edb58fc9717ac9b1210df0aa05845e17e8d856e"}
Jan 30 08:27:01 crc kubenswrapper[4870]: I0130 08:27:01.431957 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Jan 30 08:27:01 crc kubenswrapper[4870]: I0130 08:27:01.750084 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerStarted","Data":"0e6b5e7cf5bb76c5d4597f6eef6bb065c51cbad9cc2aae78711fd5e59b7109c8"}
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.064215 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gkrl7"
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.197662 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-dispersionconf\") pod \"4406e732-41a8-48a1-954a-6dbe4483a79a\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") "
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.197740 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-swiftconf\") pod \"4406e732-41a8-48a1-954a-6dbe4483a79a\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") "
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.197803 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4406e732-41a8-48a1-954a-6dbe4483a79a-etc-swift\") pod \"4406e732-41a8-48a1-954a-6dbe4483a79a\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") "
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.197824 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-ring-data-devices\") pod \"4406e732-41a8-48a1-954a-6dbe4483a79a\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") "
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.197853 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-combined-ca-bundle\") pod \"4406e732-41a8-48a1-954a-6dbe4483a79a\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") "
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.197905 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-scripts\") pod \"4406e732-41a8-48a1-954a-6dbe4483a79a\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") "
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.197927 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6z9k\" (UniqueName: \"kubernetes.io/projected/4406e732-41a8-48a1-954a-6dbe4483a79a-kube-api-access-d6z9k\") pod \"4406e732-41a8-48a1-954a-6dbe4483a79a\" (UID: \"4406e732-41a8-48a1-954a-6dbe4483a79a\") "
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.201260 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4406e732-41a8-48a1-954a-6dbe4483a79a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4406e732-41a8-48a1-954a-6dbe4483a79a" (UID: "4406e732-41a8-48a1-954a-6dbe4483a79a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.202130 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4406e732-41a8-48a1-954a-6dbe4483a79a" (UID: "4406e732-41a8-48a1-954a-6dbe4483a79a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.226570 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4406e732-41a8-48a1-954a-6dbe4483a79a-kube-api-access-d6z9k" (OuterVolumeSpecName: "kube-api-access-d6z9k") pod "4406e732-41a8-48a1-954a-6dbe4483a79a" (UID: "4406e732-41a8-48a1-954a-6dbe4483a79a"). InnerVolumeSpecName "kube-api-access-d6z9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.231814 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4406e732-41a8-48a1-954a-6dbe4483a79a" (UID: "4406e732-41a8-48a1-954a-6dbe4483a79a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.248310 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4406e732-41a8-48a1-954a-6dbe4483a79a" (UID: "4406e732-41a8-48a1-954a-6dbe4483a79a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.280458 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-scripts" (OuterVolumeSpecName: "scripts") pod "4406e732-41a8-48a1-954a-6dbe4483a79a" (UID: "4406e732-41a8-48a1-954a-6dbe4483a79a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.284822 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4406e732-41a8-48a1-954a-6dbe4483a79a" (UID: "4406e732-41a8-48a1-954a-6dbe4483a79a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.300363 4870 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-dispersionconf\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.300394 4870 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-swiftconf\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.300404 4870 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4406e732-41a8-48a1-954a-6dbe4483a79a-etc-swift\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.300412 4870 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-ring-data-devices\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.300422 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406e732-41a8-48a1-954a-6dbe4483a79a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.300430 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4406e732-41a8-48a1-954a-6dbe4483a79a-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.300439 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6z9k\" (UniqueName: \"kubernetes.io/projected/4406e732-41a8-48a1-954a-6dbe4483a79a-kube-api-access-d6z9k\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.758863 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gkrl7" event={"ID":"4406e732-41a8-48a1-954a-6dbe4483a79a","Type":"ContainerDied","Data":"c2c9f5c52e6901456bac3bbffbdb6a069ed3e92d4e7f13f97a8b660b6688ff9d"}
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.758921 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2c9f5c52e6901456bac3bbffbdb6a069ed3e92d4e7f13f97a8b660b6688ff9d"
Jan 30 08:27:02 crc kubenswrapper[4870]: I0130 08:27:02.758941 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gkrl7"
Jan 30 08:27:03 crc kubenswrapper[4870]: I0130 08:27:03.753526 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 30 08:27:03 crc kubenswrapper[4870]: I0130 08:27:03.918726 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 30 08:27:04 crc kubenswrapper[4870]: I0130 08:27:04.782587 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerStarted","Data":"49386d341f6cc7754611b6a6c194cd8140385a7b91dc049f19bb791c005d6a86"}
Jan 30 08:27:04 crc kubenswrapper[4870]: I0130 08:27:04.819852 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.25183156 podStartE2EDuration="53.819828052s" podCreationTimestamp="2026-01-30 08:26:11 +0000 UTC" firstStartedPulling="2026-01-30 08:26:21.210237746 +0000 UTC m=+1019.905784855" lastFinishedPulling="2026-01-30 08:27:03.778234238 +0000 UTC m=+1062.473781347" observedRunningTime="2026-01-30 08:27:04.813092282 +0000 UTC m=+1063.508639471" watchObservedRunningTime="2026-01-30 08:27:04.819828052 +0000 UTC m=+1063.515375201"
Jan 30 08:27:04 crc kubenswrapper[4870]: I0130 08:27:04.850653 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rwchz" podUID="496b707b-8de6-4228-b4fd-a48f3709586c" containerName="ovn-controller" probeResult="failure" output=<
Jan 30 08:27:04 crc kubenswrapper[4870]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 30 08:27:04 crc kubenswrapper[4870]: >
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.543974 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.544030 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.614251 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.802013 4870 generic.go:334] "Generic (PLEG): container finished" podID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerID="55e6a4b3af15640088e3e1927ba88636a5cf35ec532fc2df3395e46ebcf07d79" exitCode=0
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.802089 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f21b9d-25bf-4a64-94ef-51d83b662ab3","Type":"ContainerDied","Data":"55e6a4b3af15640088e3e1927ba88636a5cf35ec532fc2df3395e46ebcf07d79"}
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.803520 4870 generic.go:334] "Generic (PLEG): container finished" podID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerID="15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49" exitCode=0
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.803585 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd","Type":"ContainerDied","Data":"15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49"}
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.805300 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ab884a9-b47a-476a-8f89-140093b96527" containerID="1990bb623e12d14af684a0ba5a125e7077393acb0eeb246cda1b7953fb41a71d" exitCode=0
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.805351 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"2ab884a9-b47a-476a-8f89-140093b96527","Type":"ContainerDied","Data":"1990bb623e12d14af684a0ba5a125e7077393acb0eeb246cda1b7953fb41a71d"}
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.886245 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7k8wj"]
Jan 30 08:27:07 crc kubenswrapper[4870]: E0130 08:27:07.886597 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d4c089-1da0-424e-9f99-008407498c84" containerName="dnsmasq-dns"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.886613 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d4c089-1da0-424e-9f99-008407498c84" containerName="dnsmasq-dns"
Jan 30 08:27:07 crc kubenswrapper[4870]: E0130 08:27:07.886624 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4406e732-41a8-48a1-954a-6dbe4483a79a" containerName="swift-ring-rebalance"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.886631 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4406e732-41a8-48a1-954a-6dbe4483a79a" containerName="swift-ring-rebalance"
Jan 30 08:27:07 crc kubenswrapper[4870]: E0130 08:27:07.886655 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06e0509-685b-4010-9aef-1388bc28248d" containerName="init"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.886661 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06e0509-685b-4010-9aef-1388bc28248d" containerName="init"
Jan 30 08:27:07 crc kubenswrapper[4870]: E0130 08:27:07.886669 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d4c089-1da0-424e-9f99-008407498c84" containerName="init"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.886675 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d4c089-1da0-424e-9f99-008407498c84" containerName="init"
Jan 30 08:27:07 crc kubenswrapper[4870]: E0130 08:27:07.886687 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06e0509-685b-4010-9aef-1388bc28248d" containerName="dnsmasq-dns"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.886692 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06e0509-685b-4010-9aef-1388bc28248d" containerName="dnsmasq-dns"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.887007 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="4406e732-41a8-48a1-954a-6dbe4483a79a" containerName="swift-ring-rebalance"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.887029 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06e0509-685b-4010-9aef-1388bc28248d" containerName="dnsmasq-dns"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.887037 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d4c089-1da0-424e-9f99-008407498c84" containerName="dnsmasq-dns"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.887567 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.892280 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7k8wj"]
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.898412 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 30 08:27:07 crc kubenswrapper[4870]: I0130 08:27:07.909582 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.024813 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.031859 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wdt\" (UniqueName: \"kubernetes.io/projected/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-kube-api-access-m2wdt\") pod \"root-account-create-update-7k8wj\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.032028 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-operator-scripts\") pod \"root-account-create-update-7k8wj\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.133618 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-operator-scripts\") pod \"root-account-create-update-7k8wj\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.133740 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wdt\" (UniqueName: \"kubernetes.io/projected/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-kube-api-access-m2wdt\") pod \"root-account-create-update-7k8wj\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.134429 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-operator-scripts\") pod \"root-account-create-update-7k8wj\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.157431 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wdt\" (UniqueName: \"kubernetes.io/projected/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-kube-api-access-m2wdt\") pod \"root-account-create-update-7k8wj\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.344218 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7k8wj"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.816474 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"2ab884a9-b47a-476a-8f89-140093b96527","Type":"ContainerStarted","Data":"255f9bb78983d1770c7c49c251279626a4a8ad762e011e44910f271f2983b30d"}
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.818503 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.820608 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f21b9d-25bf-4a64-94ef-51d83b662ab3","Type":"ContainerStarted","Data":"2d90fc261d4a0e6355b34b516c467ce7b3ce867fbf835cf5614291d45b33a700"}
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.820901 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.822199 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd","Type":"ContainerStarted","Data":"121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f"}
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.823315 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.860886 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=55.477880993 podStartE2EDuration="1m4.860852635s" podCreationTimestamp="2026-01-30 08:26:04 +0000 UTC" firstStartedPulling="2026-01-30 08:26:21.259064826 +0000 UTC m=+1019.954611935" lastFinishedPulling="2026-01-30 08:26:30.642036468 +0000 UTC m=+1029.337583577" observedRunningTime="2026-01-30 08:27:08.853546789 +0000 UTC m=+1067.549093928" watchObservedRunningTime="2026-01-30 08:27:08.860852635 +0000 UTC m=+1067.556399744"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.862122 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7k8wj"]
Jan 30 08:27:08 crc kubenswrapper[4870]: W0130 08:27:08.874259 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85bfb16d_8b6c_46e2_a7e3_0a5051aa66df.slice/crio-efac2cd7376b7e1b5a56aa00876b64c5d5188202a07cca8938cbd3daf70f1cc0 WatchSource:0}: Error finding container efac2cd7376b7e1b5a56aa00876b64c5d5188202a07cca8938cbd3daf70f1cc0: Status 404 returned error can't find the container with id efac2cd7376b7e1b5a56aa00876b64c5d5188202a07cca8938cbd3daf70f1cc0
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.887926 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.30244599 podStartE2EDuration="1m4.887901966s" podCreationTimestamp="2026-01-30 08:26:04 +0000 UTC" firstStartedPulling="2026-01-30 08:26:20.68138337 +0000 UTC m=+1019.376930479" lastFinishedPulling="2026-01-30 08:26:29.266839316 +0000 UTC m=+1027.962386455" observedRunningTime="2026-01-30 08:27:08.877838428 +0000 UTC m=+1067.573385577" watchObservedRunningTime="2026-01-30 08:27:08.887901966 +0000 UTC m=+1067.583449105"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.917497 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.680512325 podStartE2EDuration="1m4.917474951s" podCreationTimestamp="2026-01-30 08:26:04 +0000 UTC" firstStartedPulling="2026-01-30 08:26:20.40499617 +0000 UTC m=+1019.100543279" lastFinishedPulling="2026-01-30 08:26:30.641958776 +0000 UTC m=+1029.337505905" observedRunningTime="2026-01-30 08:27:08.913227896 +0000 UTC m=+1067.608775005" watchObservedRunningTime="2026-01-30 08:27:08.917474951 +0000 UTC m=+1067.613022070"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.969840 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zdg4s"]
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.971093 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:08 crc kubenswrapper[4870]: I0130 08:27:08.998500 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zdg4s"]
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.063684 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r68t2\" (UniqueName: \"kubernetes.io/projected/5ac3a52d-4734-4be8-9530-6b7b535664f8-kube-api-access-r68t2\") pod \"keystone-db-create-zdg4s\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.063803 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac3a52d-4734-4be8-9530-6b7b535664f8-operator-scripts\") pod \"keystone-db-create-zdg4s\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.073387 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-be9b-account-create-update-lgqm6"]
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.080013 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.082351 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.094428 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-be9b-account-create-update-lgqm6"]
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.165561 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r68t2\" (UniqueName: \"kubernetes.io/projected/5ac3a52d-4734-4be8-9530-6b7b535664f8-kube-api-access-r68t2\") pod \"keystone-db-create-zdg4s\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.165718 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cd49cf-8353-49eb-89d2-2d3630503d9f-operator-scripts\") pod \"keystone-be9b-account-create-update-lgqm6\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.167175 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac3a52d-4734-4be8-9530-6b7b535664f8-operator-scripts\") pod \"keystone-db-create-zdg4s\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.167240 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac3a52d-4734-4be8-9530-6b7b535664f8-operator-scripts\") pod \"keystone-db-create-zdg4s\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.167289 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5kjs\" (UniqueName: \"kubernetes.io/projected/93cd49cf-8353-49eb-89d2-2d3630503d9f-kube-api-access-x5kjs\") pod \"keystone-be9b-account-create-update-lgqm6\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.189870 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r68t2\" (UniqueName: \"kubernetes.io/projected/5ac3a52d-4734-4be8-9530-6b7b535664f8-kube-api-access-r68t2\") pod \"keystone-db-create-zdg4s\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.268950 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5kjs\" (UniqueName: \"kubernetes.io/projected/93cd49cf-8353-49eb-89d2-2d3630503d9f-kube-api-access-x5kjs\") pod \"keystone-be9b-account-create-update-lgqm6\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.269456 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cd49cf-8353-49eb-89d2-2d3630503d9f-operator-scripts\") pod \"keystone-be9b-account-create-update-lgqm6\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.270197 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cd49cf-8353-49eb-89d2-2d3630503d9f-operator-scripts\") pod \"keystone-be9b-account-create-update-lgqm6\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.274171 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-772bw"]
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.275219 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-772bw"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.290121 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-772bw"]
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.295999 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5kjs\" (UniqueName: \"kubernetes.io/projected/93cd49cf-8353-49eb-89d2-2d3630503d9f-kube-api-access-x5kjs\") pod \"keystone-be9b-account-create-update-lgqm6\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.308724 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zdg4s"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.370714 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b24b-account-create-update-d2n4p"]
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.371392 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e990d4f-b684-47e6-8056-08cf765aa33d-operator-scripts\") pod \"placement-db-create-772bw\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " pod="openstack/placement-db-create-772bw"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.371571 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cc7\" (UniqueName: \"kubernetes.io/projected/9e990d4f-b684-47e6-8056-08cf765aa33d-kube-api-access-m4cc7\") pod \"placement-db-create-772bw\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " pod="openstack/placement-db-create-772bw"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.371691 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b24b-account-create-update-d2n4p"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.374329 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.391567 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b24b-account-create-update-d2n4p"]
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.410089 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-be9b-account-create-update-lgqm6"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.485253 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98chn\" (UniqueName: \"kubernetes.io/projected/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-kube-api-access-98chn\") pod \"placement-b24b-account-create-update-d2n4p\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " pod="openstack/placement-b24b-account-create-update-d2n4p"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.485342 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-operator-scripts\") pod \"placement-b24b-account-create-update-d2n4p\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " pod="openstack/placement-b24b-account-create-update-d2n4p"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.485392 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cc7\" (UniqueName: \"kubernetes.io/projected/9e990d4f-b684-47e6-8056-08cf765aa33d-kube-api-access-m4cc7\") pod \"placement-db-create-772bw\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " pod="openstack/placement-db-create-772bw"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.485445 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e990d4f-b684-47e6-8056-08cf765aa33d-operator-scripts\") pod \"placement-db-create-772bw\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " pod="openstack/placement-db-create-772bw"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.486291 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e990d4f-b684-47e6-8056-08cf765aa33d-operator-scripts\") pod \"placement-db-create-772bw\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " pod="openstack/placement-db-create-772bw"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.504355 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cc7\" (UniqueName: \"kubernetes.io/projected/9e990d4f-b684-47e6-8056-08cf765aa33d-kube-api-access-m4cc7\") pod \"placement-db-create-772bw\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " pod="openstack/placement-db-create-772bw"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.587248 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-operator-scripts\") pod \"placement-b24b-account-create-update-d2n4p\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " pod="openstack/placement-b24b-account-create-update-d2n4p"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.587656 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98chn\" (UniqueName: \"kubernetes.io/projected/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-kube-api-access-98chn\") pod \"placement-b24b-account-create-update-d2n4p\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " pod="openstack/placement-b24b-account-create-update-d2n4p"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.587898 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-operator-scripts\") pod \"placement-b24b-account-create-update-d2n4p\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " pod="openstack/placement-b24b-account-create-update-d2n4p"
Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.589996 4870 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-create-772bw" Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.602414 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98chn\" (UniqueName: \"kubernetes.io/projected/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-kube-api-access-98chn\") pod \"placement-b24b-account-create-update-d2n4p\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " pod="openstack/placement-b24b-account-create-update-d2n4p" Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.688349 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b24b-account-create-update-d2n4p" Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.793051 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zdg4s"] Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.832091 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rwchz" podUID="496b707b-8de6-4228-b4fd-a48f3709586c" containerName="ovn-controller" probeResult="failure" output=< Jan 30 08:27:09 crc kubenswrapper[4870]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 08:27:09 crc kubenswrapper[4870]: > Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.836303 4870 generic.go:334] "Generic (PLEG): container finished" podID="85bfb16d-8b6c-46e2-a7e3-0a5051aa66df" containerID="7a7adac6f43dd00107198ca12f07a56a507d2b37982cb644a01747e8eb0b5b52" exitCode=0 Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.836365 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7k8wj" event={"ID":"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df","Type":"ContainerDied","Data":"7a7adac6f43dd00107198ca12f07a56a507d2b37982cb644a01747e8eb0b5b52"} Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.836393 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-7k8wj" event={"ID":"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df","Type":"ContainerStarted","Data":"efac2cd7376b7e1b5a56aa00876b64c5d5188202a07cca8938cbd3daf70f1cc0"} Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.838946 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zdg4s" event={"ID":"5ac3a52d-4734-4be8-9530-6b7b535664f8","Type":"ContainerStarted","Data":"9d8f67b6ce532dd69259548889e2d099125d75a9ac7b7e4d54afd39958c000d2"} Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.852172 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.871950 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gznh8" Jan 30 08:27:09 crc kubenswrapper[4870]: I0130 08:27:09.946367 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-be9b-account-create-update-lgqm6"] Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.086599 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-772bw"] Jan 30 08:27:10 crc kubenswrapper[4870]: W0130 08:27:10.093053 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e990d4f_b684_47e6_8056_08cf765aa33d.slice/crio-714d1725f1b75a23ac0a271ba8ee0cf966f9c24054a5cb144dabe40f11e9f5c3 WatchSource:0}: Error finding container 714d1725f1b75a23ac0a271ba8ee0cf966f9c24054a5cb144dabe40f11e9f5c3: Status 404 returned error can't find the container with id 714d1725f1b75a23ac0a271ba8ee0cf966f9c24054a5cb144dabe40f11e9f5c3 Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.107930 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rwchz-config-88kmx"] Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.109763 4870 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.114791 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.132585 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rwchz-config-88kmx"] Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.198510 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run-ovn\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.198569 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-scripts\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.198749 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pkr\" (UniqueName: \"kubernetes.io/projected/377059c1-2286-4127-b4cc-d19ef6bac327-kube-api-access-52pkr\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.199033 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run\") pod 
\"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.199174 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-log-ovn\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.199257 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-additional-scripts\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: W0130 08:27:10.240350 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b66abfb_27d1_415e_abf2_2cb855a2bcaf.slice/crio-46c4c020377146f5485f00dabefc993bf1e423c4a1be085848906b1eeb21803f WatchSource:0}: Error finding container 46c4c020377146f5485f00dabefc993bf1e423c4a1be085848906b1eeb21803f: Status 404 returned error can't find the container with id 46c4c020377146f5485f00dabefc993bf1e423c4a1be085848906b1eeb21803f Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.243933 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b24b-account-create-update-d2n4p"] Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301071 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: 
\"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301139 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-log-ovn\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301200 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-additional-scripts\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301227 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run-ovn\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301253 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-scripts\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301284 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pkr\" (UniqueName: \"kubernetes.io/projected/377059c1-2286-4127-b4cc-d19ef6bac327-kube-api-access-52pkr\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: 
\"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301436 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301535 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run-ovn\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.301588 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-log-ovn\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.302359 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-additional-scripts\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.303549 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-scripts\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " 
pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.321144 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pkr\" (UniqueName: \"kubernetes.io/projected/377059c1-2286-4127-b4cc-d19ef6bac327-kube-api-access-52pkr\") pod \"ovn-controller-rwchz-config-88kmx\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.431782 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.846629 4870 generic.go:334] "Generic (PLEG): container finished" podID="5ac3a52d-4734-4be8-9530-6b7b535664f8" containerID="ae557205b83ba573012321c0b15a5b47277e108dca93d5acd055965c34b03da8" exitCode=0 Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.846703 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zdg4s" event={"ID":"5ac3a52d-4734-4be8-9530-6b7b535664f8","Type":"ContainerDied","Data":"ae557205b83ba573012321c0b15a5b47277e108dca93d5acd055965c34b03da8"} Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.847957 4870 generic.go:334] "Generic (PLEG): container finished" podID="93cd49cf-8353-49eb-89d2-2d3630503d9f" containerID="6dec8f4d9911b49219f94545d1dff11226dd491baa26e53a02289cf2ce287699" exitCode=0 Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.848033 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-be9b-account-create-update-lgqm6" event={"ID":"93cd49cf-8353-49eb-89d2-2d3630503d9f","Type":"ContainerDied","Data":"6dec8f4d9911b49219f94545d1dff11226dd491baa26e53a02289cf2ce287699"} Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.848225 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-be9b-account-create-update-lgqm6" 
event={"ID":"93cd49cf-8353-49eb-89d2-2d3630503d9f","Type":"ContainerStarted","Data":"0c370191997744f514fd76f73204f4cad83d46ceaa3ed748d4231b4f9e72df11"} Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.849342 4870 generic.go:334] "Generic (PLEG): container finished" podID="3b66abfb-27d1-415e-abf2-2cb855a2bcaf" containerID="78530e29e6f33fe9e6244539f845bfc30d3752986bcfd2b607b62cc6f7d5aab3" exitCode=0 Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.849404 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b24b-account-create-update-d2n4p" event={"ID":"3b66abfb-27d1-415e-abf2-2cb855a2bcaf","Type":"ContainerDied","Data":"78530e29e6f33fe9e6244539f845bfc30d3752986bcfd2b607b62cc6f7d5aab3"} Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.849430 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b24b-account-create-update-d2n4p" event={"ID":"3b66abfb-27d1-415e-abf2-2cb855a2bcaf","Type":"ContainerStarted","Data":"46c4c020377146f5485f00dabefc993bf1e423c4a1be085848906b1eeb21803f"} Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.850561 4870 generic.go:334] "Generic (PLEG): container finished" podID="9e990d4f-b684-47e6-8056-08cf765aa33d" containerID="7e7325618d20bdeab54c732bc7a397cb58a9db4a697a599a002533f4811bf8bd" exitCode=0 Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.850590 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-772bw" event={"ID":"9e990d4f-b684-47e6-8056-08cf765aa33d","Type":"ContainerDied","Data":"7e7325618d20bdeab54c732bc7a397cb58a9db4a697a599a002533f4811bf8bd"} Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 08:27:10.850614 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-772bw" event={"ID":"9e990d4f-b684-47e6-8056-08cf765aa33d","Type":"ContainerStarted","Data":"714d1725f1b75a23ac0a271ba8ee0cf966f9c24054a5cb144dabe40f11e9f5c3"} Jan 30 08:27:10 crc kubenswrapper[4870]: I0130 
08:27:10.958136 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rwchz-config-88kmx"] Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.239977 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7k8wj" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.322031 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2wdt\" (UniqueName: \"kubernetes.io/projected/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-kube-api-access-m2wdt\") pod \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.322235 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-operator-scripts\") pod \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\" (UID: \"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df\") " Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.323016 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85bfb16d-8b6c-46e2-a7e3-0a5051aa66df" (UID: "85bfb16d-8b6c-46e2-a7e3-0a5051aa66df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.329910 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-kube-api-access-m2wdt" (OuterVolumeSpecName: "kube-api-access-m2wdt") pod "85bfb16d-8b6c-46e2-a7e3-0a5051aa66df" (UID: "85bfb16d-8b6c-46e2-a7e3-0a5051aa66df"). InnerVolumeSpecName "kube-api-access-m2wdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.337202 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-x6s7d"] Jan 30 08:27:11 crc kubenswrapper[4870]: E0130 08:27:11.338090 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bfb16d-8b6c-46e2-a7e3-0a5051aa66df" containerName="mariadb-account-create-update" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.338128 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bfb16d-8b6c-46e2-a7e3-0a5051aa66df" containerName="mariadb-account-create-update" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.338379 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bfb16d-8b6c-46e2-a7e3-0a5051aa66df" containerName="mariadb-account-create-update" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.341247 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.347920 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-x6s7d"] Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.424515 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-operator-scripts\") pod \"watcher-db-create-x6s7d\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.424579 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jrcr\" (UniqueName: \"kubernetes.io/projected/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-kube-api-access-2jrcr\") pod \"watcher-db-create-x6s7d\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " 
pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.424779 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.424794 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2wdt\" (UniqueName: \"kubernetes.io/projected/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df-kube-api-access-m2wdt\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.425649 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-a8a4-account-create-update-8gm2f"] Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.426639 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.431599 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.442460 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a8a4-account-create-update-8gm2f"] Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.526289 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585a2047-d3db-4822-89b3-52fcd65d6e09-operator-scripts\") pod \"watcher-a8a4-account-create-update-8gm2f\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.526378 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-operator-scripts\") 
pod \"watcher-db-create-x6s7d\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.526573 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zxg9\" (UniqueName: \"kubernetes.io/projected/585a2047-d3db-4822-89b3-52fcd65d6e09-kube-api-access-8zxg9\") pod \"watcher-a8a4-account-create-update-8gm2f\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.526624 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jrcr\" (UniqueName: \"kubernetes.io/projected/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-kube-api-access-2jrcr\") pod \"watcher-db-create-x6s7d\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.527145 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-operator-scripts\") pod \"watcher-db-create-x6s7d\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.545480 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jrcr\" (UniqueName: \"kubernetes.io/projected/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-kube-api-access-2jrcr\") pod \"watcher-db-create-x6s7d\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.628893 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585a2047-d3db-4822-89b3-52fcd65d6e09-operator-scripts\") pod 
\"watcher-a8a4-account-create-update-8gm2f\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.628991 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zxg9\" (UniqueName: \"kubernetes.io/projected/585a2047-d3db-4822-89b3-52fcd65d6e09-kube-api-access-8zxg9\") pod \"watcher-a8a4-account-create-update-8gm2f\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.629797 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585a2047-d3db-4822-89b3-52fcd65d6e09-operator-scripts\") pod \"watcher-a8a4-account-create-update-8gm2f\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.651607 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zxg9\" (UniqueName: \"kubernetes.io/projected/585a2047-d3db-4822-89b3-52fcd65d6e09-kube-api-access-8zxg9\") pod \"watcher-a8a4-account-create-update-8gm2f\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.657715 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.744407 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.868338 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7k8wj" event={"ID":"85bfb16d-8b6c-46e2-a7e3-0a5051aa66df","Type":"ContainerDied","Data":"efac2cd7376b7e1b5a56aa00876b64c5d5188202a07cca8938cbd3daf70f1cc0"} Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.868378 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efac2cd7376b7e1b5a56aa00876b64c5d5188202a07cca8938cbd3daf70f1cc0" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.868402 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7k8wj" Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.876667 4870 generic.go:334] "Generic (PLEG): container finished" podID="377059c1-2286-4127-b4cc-d19ef6bac327" containerID="cd0dfeab70fb307cbb6535bdd2b5daa2556dc7c49a1bf88e90112f1cde7b135d" exitCode=0 Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.877477 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz-config-88kmx" event={"ID":"377059c1-2286-4127-b4cc-d19ef6bac327","Type":"ContainerDied","Data":"cd0dfeab70fb307cbb6535bdd2b5daa2556dc7c49a1bf88e90112f1cde7b135d"} Jan 30 08:27:11 crc kubenswrapper[4870]: I0130 08:27:11.877500 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz-config-88kmx" event={"ID":"377059c1-2286-4127-b4cc-d19ef6bac327","Type":"ContainerStarted","Data":"aa51696b6012176e76d2c25bdaddadb683836a8392f3031a0bab8fd34cf74ff2"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.104480 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-x6s7d"] Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.352783 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-be9b-account-create-update-lgqm6" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.362887 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b24b-account-create-update-d2n4p" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.437304 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-772bw" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.452130 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cd49cf-8353-49eb-89d2-2d3630503d9f-operator-scripts\") pod \"93cd49cf-8353-49eb-89d2-2d3630503d9f\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.452968 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93cd49cf-8353-49eb-89d2-2d3630503d9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93cd49cf-8353-49eb-89d2-2d3630503d9f" (UID: "93cd49cf-8353-49eb-89d2-2d3630503d9f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.453152 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5kjs\" (UniqueName: \"kubernetes.io/projected/93cd49cf-8353-49eb-89d2-2d3630503d9f-kube-api-access-x5kjs\") pod \"93cd49cf-8353-49eb-89d2-2d3630503d9f\" (UID: \"93cd49cf-8353-49eb-89d2-2d3630503d9f\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.454263 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cd49cf-8353-49eb-89d2-2d3630503d9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.455749 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zdg4s" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.457992 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93cd49cf-8353-49eb-89d2-2d3630503d9f-kube-api-access-x5kjs" (OuterVolumeSpecName: "kube-api-access-x5kjs") pod "93cd49cf-8353-49eb-89d2-2d3630503d9f" (UID: "93cd49cf-8353-49eb-89d2-2d3630503d9f"). InnerVolumeSpecName "kube-api-access-x5kjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.527950 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a8a4-account-create-update-8gm2f"] Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555338 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac3a52d-4734-4be8-9530-6b7b535664f8-operator-scripts\") pod \"5ac3a52d-4734-4be8-9530-6b7b535664f8\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555388 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98chn\" (UniqueName: \"kubernetes.io/projected/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-kube-api-access-98chn\") pod \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555477 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e990d4f-b684-47e6-8056-08cf765aa33d-operator-scripts\") pod \"9e990d4f-b684-47e6-8056-08cf765aa33d\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555517 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r68t2\" (UniqueName: \"kubernetes.io/projected/5ac3a52d-4734-4be8-9530-6b7b535664f8-kube-api-access-r68t2\") pod \"5ac3a52d-4734-4be8-9530-6b7b535664f8\" (UID: \"5ac3a52d-4734-4be8-9530-6b7b535664f8\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555579 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4cc7\" (UniqueName: \"kubernetes.io/projected/9e990d4f-b684-47e6-8056-08cf765aa33d-kube-api-access-m4cc7\") pod 
\"9e990d4f-b684-47e6-8056-08cf765aa33d\" (UID: \"9e990d4f-b684-47e6-8056-08cf765aa33d\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555642 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-operator-scripts\") pod \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\" (UID: \"3b66abfb-27d1-415e-abf2-2cb855a2bcaf\") " Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555848 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac3a52d-4734-4be8-9530-6b7b535664f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ac3a52d-4734-4be8-9530-6b7b535664f8" (UID: "5ac3a52d-4734-4be8-9530-6b7b535664f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555931 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e990d4f-b684-47e6-8056-08cf765aa33d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e990d4f-b684-47e6-8056-08cf765aa33d" (UID: "9e990d4f-b684-47e6-8056-08cf765aa33d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.555971 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5kjs\" (UniqueName: \"kubernetes.io/projected/93cd49cf-8353-49eb-89d2-2d3630503d9f-kube-api-access-x5kjs\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.556041 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac3a52d-4734-4be8-9530-6b7b535664f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.556324 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b66abfb-27d1-415e-abf2-2cb855a2bcaf" (UID: "3b66abfb-27d1-415e-abf2-2cb855a2bcaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.559417 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-kube-api-access-98chn" (OuterVolumeSpecName: "kube-api-access-98chn") pod "3b66abfb-27d1-415e-abf2-2cb855a2bcaf" (UID: "3b66abfb-27d1-415e-abf2-2cb855a2bcaf"). InnerVolumeSpecName "kube-api-access-98chn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.559551 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e990d4f-b684-47e6-8056-08cf765aa33d-kube-api-access-m4cc7" (OuterVolumeSpecName: "kube-api-access-m4cc7") pod "9e990d4f-b684-47e6-8056-08cf765aa33d" (UID: "9e990d4f-b684-47e6-8056-08cf765aa33d"). InnerVolumeSpecName "kube-api-access-m4cc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.559692 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac3a52d-4734-4be8-9530-6b7b535664f8-kube-api-access-r68t2" (OuterVolumeSpecName: "kube-api-access-r68t2") pod "5ac3a52d-4734-4be8-9530-6b7b535664f8" (UID: "5ac3a52d-4734-4be8-9530-6b7b535664f8"). InnerVolumeSpecName "kube-api-access-r68t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.613604 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.617039 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.657341 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4cc7\" (UniqueName: \"kubernetes.io/projected/9e990d4f-b684-47e6-8056-08cf765aa33d-kube-api-access-m4cc7\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.657377 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.657392 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98chn\" (UniqueName: \"kubernetes.io/projected/3b66abfb-27d1-415e-abf2-2cb855a2bcaf-kube-api-access-98chn\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.657403 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e990d4f-b684-47e6-8056-08cf765aa33d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 
08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.657416 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r68t2\" (UniqueName: \"kubernetes.io/projected/5ac3a52d-4734-4be8-9530-6b7b535664f8-kube-api-access-r68t2\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.888953 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b24b-account-create-update-d2n4p" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.888948 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b24b-account-create-update-d2n4p" event={"ID":"3b66abfb-27d1-415e-abf2-2cb855a2bcaf","Type":"ContainerDied","Data":"46c4c020377146f5485f00dabefc993bf1e423c4a1be085848906b1eeb21803f"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.889182 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c4c020377146f5485f00dabefc993bf1e423c4a1be085848906b1eeb21803f" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.890677 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a8a4-account-create-update-8gm2f" event={"ID":"585a2047-d3db-4822-89b3-52fcd65d6e09","Type":"ContainerStarted","Data":"e32e8f9eb095a0767af1259a467ea84160f17bae2cb726e02486629d03a26d33"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.890713 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a8a4-account-create-update-8gm2f" event={"ID":"585a2047-d3db-4822-89b3-52fcd65d6e09","Type":"ContainerStarted","Data":"0045c9e1377a9668a3ada55443745aaffe10a93f434201f632113f32311c59ee"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.893310 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zdg4s" event={"ID":"5ac3a52d-4734-4be8-9530-6b7b535664f8","Type":"ContainerDied","Data":"9d8f67b6ce532dd69259548889e2d099125d75a9ac7b7e4d54afd39958c000d2"} Jan 30 
08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.893334 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d8f67b6ce532dd69259548889e2d099125d75a9ac7b7e4d54afd39958c000d2" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.893377 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zdg4s" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.906396 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-be9b-account-create-update-lgqm6" event={"ID":"93cd49cf-8353-49eb-89d2-2d3630503d9f","Type":"ContainerDied","Data":"0c370191997744f514fd76f73204f4cad83d46ceaa3ed748d4231b4f9e72df11"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.906435 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c370191997744f514fd76f73204f4cad83d46ceaa3ed748d4231b4f9e72df11" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.906495 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-be9b-account-create-update-lgqm6" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.910780 4870 generic.go:334] "Generic (PLEG): container finished" podID="40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3" containerID="5726ace895a9d7102cc621cf411a4327a47995798d8abdba29b293b762399c80" exitCode=0 Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.910897 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-x6s7d" event={"ID":"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3","Type":"ContainerDied","Data":"5726ace895a9d7102cc621cf411a4327a47995798d8abdba29b293b762399c80"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.910953 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-x6s7d" event={"ID":"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3","Type":"ContainerStarted","Data":"1f97d6a64f70d41e661a86d41b4c29baca98bfc6e559fe436d16cd50b549a65d"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.914076 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-772bw" event={"ID":"9e990d4f-b684-47e6-8056-08cf765aa33d","Type":"ContainerDied","Data":"714d1725f1b75a23ac0a271ba8ee0cf966f9c24054a5cb144dabe40f11e9f5c3"} Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.914131 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="714d1725f1b75a23ac0a271ba8ee0cf966f9c24054a5cb144dabe40f11e9f5c3" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.914206 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-772bw" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.917126 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:12 crc kubenswrapper[4870]: I0130 08:27:12.938267 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-a8a4-account-create-update-8gm2f" podStartSLOduration=1.9382475160000001 podStartE2EDuration="1.938247516s" podCreationTimestamp="2026-01-30 08:27:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:12.934705841 +0000 UTC m=+1071.630252950" watchObservedRunningTime="2026-01-30 08:27:12.938247516 +0000 UTC m=+1071.633794625" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.351932 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.477559 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52pkr\" (UniqueName: \"kubernetes.io/projected/377059c1-2286-4127-b4cc-d19ef6bac327-kube-api-access-52pkr\") pod \"377059c1-2286-4127-b4cc-d19ef6bac327\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.477753 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-scripts\") pod \"377059c1-2286-4127-b4cc-d19ef6bac327\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.477805 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-log-ovn\") pod 
\"377059c1-2286-4127-b4cc-d19ef6bac327\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.477827 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-additional-scripts\") pod \"377059c1-2286-4127-b4cc-d19ef6bac327\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.477942 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run-ovn\") pod \"377059c1-2286-4127-b4cc-d19ef6bac327\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.477989 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run\") pod \"377059c1-2286-4127-b4cc-d19ef6bac327\" (UID: \"377059c1-2286-4127-b4cc-d19ef6bac327\") " Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.482089 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "377059c1-2286-4127-b4cc-d19ef6bac327" (UID: "377059c1-2286-4127-b4cc-d19ef6bac327"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.482121 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "377059c1-2286-4127-b4cc-d19ef6bac327" (UID: "377059c1-2286-4127-b4cc-d19ef6bac327"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.482842 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "377059c1-2286-4127-b4cc-d19ef6bac327" (UID: "377059c1-2286-4127-b4cc-d19ef6bac327"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.483436 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-scripts" (OuterVolumeSpecName: "scripts") pod "377059c1-2286-4127-b4cc-d19ef6bac327" (UID: "377059c1-2286-4127-b4cc-d19ef6bac327"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.486113 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run" (OuterVolumeSpecName: "var-run") pod "377059c1-2286-4127-b4cc-d19ef6bac327" (UID: "377059c1-2286-4127-b4cc-d19ef6bac327"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.508838 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377059c1-2286-4127-b4cc-d19ef6bac327-kube-api-access-52pkr" (OuterVolumeSpecName: "kube-api-access-52pkr") pod "377059c1-2286-4127-b4cc-d19ef6bac327" (UID: "377059c1-2286-4127-b4cc-d19ef6bac327"). InnerVolumeSpecName "kube-api-access-52pkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.581113 4870 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.581143 4870 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.581153 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52pkr\" (UniqueName: \"kubernetes.io/projected/377059c1-2286-4127-b4cc-d19ef6bac327-kube-api-access-52pkr\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.581166 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.581174 4870 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/377059c1-2286-4127-b4cc-d19ef6bac327-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.581183 4870 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/377059c1-2286-4127-b4cc-d19ef6bac327-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.923048 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz-config-88kmx" event={"ID":"377059c1-2286-4127-b4cc-d19ef6bac327","Type":"ContainerDied","Data":"aa51696b6012176e76d2c25bdaddadb683836a8392f3031a0bab8fd34cf74ff2"} Jan 30 08:27:13 crc 
kubenswrapper[4870]: I0130 08:27:13.923958 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa51696b6012176e76d2c25bdaddadb683836a8392f3031a0bab8fd34cf74ff2" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.923154 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-88kmx" Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.924400 4870 generic.go:334] "Generic (PLEG): container finished" podID="585a2047-d3db-4822-89b3-52fcd65d6e09" containerID="e32e8f9eb095a0767af1259a467ea84160f17bae2cb726e02486629d03a26d33" exitCode=0 Jan 30 08:27:13 crc kubenswrapper[4870]: I0130 08:27:13.924865 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a8a4-account-create-update-8gm2f" event={"ID":"585a2047-d3db-4822-89b3-52fcd65d6e09","Type":"ContainerDied","Data":"e32e8f9eb095a0767af1259a467ea84160f17bae2cb726e02486629d03a26d33"} Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.297251 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.396126 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-operator-scripts\") pod \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.396335 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jrcr\" (UniqueName: \"kubernetes.io/projected/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-kube-api-access-2jrcr\") pod \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\" (UID: \"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3\") " Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.396790 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.402039 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-kube-api-access-2jrcr" (OuterVolumeSpecName: "kube-api-access-2jrcr") pod "40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3" (UID: "40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3"). InnerVolumeSpecName "kube-api-access-2jrcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.402488 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3" (UID: "40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.415206 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/46634e41-7d5b-4181-b824-716bb37fca47-etc-swift\") pod \"swift-storage-0\" (UID: \"46634e41-7d5b-4181-b824-716bb37fca47\") " pod="openstack/swift-storage-0" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.478260 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rwchz-config-88kmx"] Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.484734 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rwchz-config-88kmx"] Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.498174 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.498216 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jrcr\" (UniqueName: \"kubernetes.io/projected/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3-kube-api-access-2jrcr\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.532025 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.579540 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rwchz-config-9v2n7"] Jan 30 08:27:14 crc kubenswrapper[4870]: E0130 08:27:14.580140 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377059c1-2286-4127-b4cc-d19ef6bac327" containerName="ovn-config" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580239 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="377059c1-2286-4127-b4cc-d19ef6bac327" containerName="ovn-config" Jan 30 08:27:14 crc kubenswrapper[4870]: E0130 08:27:14.580256 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93cd49cf-8353-49eb-89d2-2d3630503d9f" containerName="mariadb-account-create-update" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580262 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="93cd49cf-8353-49eb-89d2-2d3630503d9f" containerName="mariadb-account-create-update" Jan 30 08:27:14 crc kubenswrapper[4870]: E0130 08:27:14.580273 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e990d4f-b684-47e6-8056-08cf765aa33d" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580280 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e990d4f-b684-47e6-8056-08cf765aa33d" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: E0130 08:27:14.580322 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b66abfb-27d1-415e-abf2-2cb855a2bcaf" containerName="mariadb-account-create-update" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580329 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b66abfb-27d1-415e-abf2-2cb855a2bcaf" containerName="mariadb-account-create-update" Jan 30 08:27:14 crc kubenswrapper[4870]: E0130 08:27:14.580343 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580350 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: E0130 08:27:14.580360 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac3a52d-4734-4be8-9530-6b7b535664f8" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580366 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac3a52d-4734-4be8-9530-6b7b535664f8" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580572 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="93cd49cf-8353-49eb-89d2-2d3630503d9f" containerName="mariadb-account-create-update" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580586 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b66abfb-27d1-415e-abf2-2cb855a2bcaf" containerName="mariadb-account-create-update" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580599 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580611 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e990d4f-b684-47e6-8056-08cf765aa33d" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580622 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac3a52d-4734-4be8-9530-6b7b535664f8" containerName="mariadb-database-create" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.580664 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="377059c1-2286-4127-b4cc-d19ef6bac327" containerName="ovn-config" Jan 30 08:27:14 crc 
kubenswrapper[4870]: I0130 08:27:14.581444 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.584752 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.596665 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rwchz-config-9v2n7"] Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.701268 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run-ovn\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.701651 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-additional-scripts\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.701684 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.701722 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-scripts\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.701769 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-log-ovn\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.701815 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4c72\" (UniqueName: \"kubernetes.io/projected/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-kube-api-access-v4c72\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.802746 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.802851 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-scripts\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.803132 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-log-ovn\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.803188 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4c72\" (UniqueName: \"kubernetes.io/projected/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-kube-api-access-v4c72\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.803256 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run-ovn\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.803289 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-additional-scripts\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.803422 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.803639 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-log-ovn\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.803725 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run-ovn\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.804213 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-additional-scripts\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.804850 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-scripts\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.824081 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4c72\" (UniqueName: \"kubernetes.io/projected/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-kube-api-access-v4c72\") pod \"ovn-controller-rwchz-config-9v2n7\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") " pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.824124 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rwchz" Jan 30 08:27:14 crc 
kubenswrapper[4870]: I0130 08:27:14.901760 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-9v2n7" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.932809 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-x6s7d" Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.933071 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-x6s7d" event={"ID":"40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3","Type":"ContainerDied","Data":"1f97d6a64f70d41e661a86d41b4c29baca98bfc6e559fe436d16cd50b549a65d"} Jan 30 08:27:14 crc kubenswrapper[4870]: I0130 08:27:14.933115 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f97d6a64f70d41e661a86d41b4c29baca98bfc6e559fe436d16cd50b549a65d" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.155779 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.213793 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.312614 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585a2047-d3db-4822-89b3-52fcd65d6e09-operator-scripts\") pod \"585a2047-d3db-4822-89b3-52fcd65d6e09\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.312674 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zxg9\" (UniqueName: \"kubernetes.io/projected/585a2047-d3db-4822-89b3-52fcd65d6e09-kube-api-access-8zxg9\") pod \"585a2047-d3db-4822-89b3-52fcd65d6e09\" (UID: \"585a2047-d3db-4822-89b3-52fcd65d6e09\") " Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.313504 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585a2047-d3db-4822-89b3-52fcd65d6e09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "585a2047-d3db-4822-89b3-52fcd65d6e09" (UID: "585a2047-d3db-4822-89b3-52fcd65d6e09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.319060 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585a2047-d3db-4822-89b3-52fcd65d6e09-kube-api-access-8zxg9" (OuterVolumeSpecName: "kube-api-access-8zxg9") pod "585a2047-d3db-4822-89b3-52fcd65d6e09" (UID: "585a2047-d3db-4822-89b3-52fcd65d6e09"). InnerVolumeSpecName "kube-api-access-8zxg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.414841 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585a2047-d3db-4822-89b3-52fcd65d6e09-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.414907 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zxg9\" (UniqueName: \"kubernetes.io/projected/585a2047-d3db-4822-89b3-52fcd65d6e09-kube-api-access-8zxg9\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.431292 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rwchz-config-9v2n7"] Jan 30 08:27:15 crc kubenswrapper[4870]: W0130 08:27:15.585475 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c02c670_5f7d_4ee6_9072_e6e1ba2d6c61.slice/crio-158eec22a72dfc9360224d25636792800780cc5b6659cf69de830857fbfed11d WatchSource:0}: Error finding container 158eec22a72dfc9360224d25636792800780cc5b6659cf69de830857fbfed11d: Status 404 returned error can't find the container with id 158eec22a72dfc9360224d25636792800780cc5b6659cf69de830857fbfed11d Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.653472 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.653706 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="prometheus" containerID="cri-o://4c9a2cc96afb4697dcd9efa47cf237f34c2e2a0fb97e86a04bed4e71098a047b" gracePeriod=600 Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.653800 4870 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="thanos-sidecar" containerID="cri-o://49386d341f6cc7754611b6a6c194cd8140385a7b91dc049f19bb791c005d6a86" gracePeriod=600 Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.653818 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="config-reloader" containerID="cri-o://0e6b5e7cf5bb76c5d4597f6eef6bb065c51cbad9cc2aae78711fd5e59b7109c8" gracePeriod=600 Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.960299 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a8a4-account-create-update-8gm2f" event={"ID":"585a2047-d3db-4822-89b3-52fcd65d6e09","Type":"ContainerDied","Data":"0045c9e1377a9668a3ada55443745aaffe10a93f434201f632113f32311c59ee"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.960648 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0045c9e1377a9668a3ada55443745aaffe10a93f434201f632113f32311c59ee" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.960394 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-a8a4-account-create-update-8gm2f" Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.963233 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"8f47ea6bf4d5583a27aaad61d251bd16077c3d6a9c987da87476d52e32cafb52"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.963279 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"21d3f193095c3bf008b3c6fbc5e88d15f5178aea214282e57db0b0c4bce0d86b"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.964898 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz-config-9v2n7" event={"ID":"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61","Type":"ContainerStarted","Data":"d52e878ce9dd90e8dba444ebd6a2071ac79735b92b1f1220889d88eefcb18bc4"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.964928 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz-config-9v2n7" event={"ID":"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61","Type":"ContainerStarted","Data":"158eec22a72dfc9360224d25636792800780cc5b6659cf69de830857fbfed11d"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.970371 4870 generic.go:334] "Generic (PLEG): container finished" podID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerID="49386d341f6cc7754611b6a6c194cd8140385a7b91dc049f19bb791c005d6a86" exitCode=0 Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.970398 4870 generic.go:334] "Generic (PLEG): container finished" podID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerID="0e6b5e7cf5bb76c5d4597f6eef6bb065c51cbad9cc2aae78711fd5e59b7109c8" exitCode=0 Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.970407 4870 generic.go:334] "Generic (PLEG): container finished" 
podID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerID="4c9a2cc96afb4697dcd9efa47cf237f34c2e2a0fb97e86a04bed4e71098a047b" exitCode=0 Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.970428 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerDied","Data":"49386d341f6cc7754611b6a6c194cd8140385a7b91dc049f19bb791c005d6a86"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.970453 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerDied","Data":"0e6b5e7cf5bb76c5d4597f6eef6bb065c51cbad9cc2aae78711fd5e59b7109c8"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.970462 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerDied","Data":"4c9a2cc96afb4697dcd9efa47cf237f34c2e2a0fb97e86a04bed4e71098a047b"} Jan 30 08:27:15 crc kubenswrapper[4870]: I0130 08:27:15.983327 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rwchz-config-9v2n7" podStartSLOduration=1.983310267 podStartE2EDuration="1.983310267s" podCreationTimestamp="2026-01-30 08:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:15.979814603 +0000 UTC m=+1074.675361712" watchObservedRunningTime="2026-01-30 08:27:15.983310267 +0000 UTC m=+1074.678857376" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.083358 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377059c1-2286-4127-b4cc-d19ef6bac327" path="/var/lib/kubelet/pods/377059c1-2286-4127-b4cc-d19ef6bac327/volumes" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.277745 4870 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/root-account-create-update-7k8wj"] Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.282825 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7k8wj"] Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.364054 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cpgc6"] Jan 30 08:27:16 crc kubenswrapper[4870]: E0130 08:27:16.364373 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585a2047-d3db-4822-89b3-52fcd65d6e09" containerName="mariadb-account-create-update" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.364384 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="585a2047-d3db-4822-89b3-52fcd65d6e09" containerName="mariadb-account-create-update" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.364549 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="585a2047-d3db-4822-89b3-52fcd65d6e09" containerName="mariadb-account-create-update" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.365059 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.375195 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.379367 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cpgc6"] Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.547699 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc3ddf0-5fc8-4425-a434-1452753e1297-operator-scripts\") pod \"root-account-create-update-cpgc6\" (UID: \"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.547900 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9t6n\" (UniqueName: \"kubernetes.io/projected/8bc3ddf0-5fc8-4425-a434-1452753e1297-kube-api-access-f9t6n\") pod \"root-account-create-update-cpgc6\" (UID: \"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.650618 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc3ddf0-5fc8-4425-a434-1452753e1297-operator-scripts\") pod \"root-account-create-update-cpgc6\" (UID: \"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.650966 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9t6n\" (UniqueName: \"kubernetes.io/projected/8bc3ddf0-5fc8-4425-a434-1452753e1297-kube-api-access-f9t6n\") pod \"root-account-create-update-cpgc6\" (UID: 
\"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.651612 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc3ddf0-5fc8-4425-a434-1452753e1297-operator-scripts\") pod \"root-account-create-update-cpgc6\" (UID: \"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.679739 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9t6n\" (UniqueName: \"kubernetes.io/projected/8bc3ddf0-5fc8-4425-a434-1452753e1297-kube-api-access-f9t6n\") pod \"root-account-create-update-cpgc6\" (UID: \"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.698405 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.758340 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954553 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-tls-assets\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954693 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-0\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954738 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z5p8\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-kube-api-access-9z5p8\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954757 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-web-config\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954790 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-1\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954836 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-2\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954950 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.954984 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.955005 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config-out\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.955037 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-thanos-prometheus-http-client-file\") pod \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\" (UID: \"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7\") " Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.956751 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.957103 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.957402 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.961000 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config-out" (OuterVolumeSpecName: "config-out") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.961609 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config" (OuterVolumeSpecName: "config") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.961697 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-kube-api-access-9z5p8" (OuterVolumeSpecName: "kube-api-access-9z5p8") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "kube-api-access-9z5p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.970983 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.973227 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.976524 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.983863 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-web-config" (OuterVolumeSpecName: "web-config") pod "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" (UID: "0410b897-4bd8-48aa-a9fd-8213f6d9dbd7"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.985132 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0410b897-4bd8-48aa-a9fd-8213f6d9dbd7","Type":"ContainerDied","Data":"709385d651d4bcb103b7d7d0e2928451ab8b488203130eb1f7baa5322860b0f5"} Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.985172 4870 scope.go:117] "RemoveContainer" containerID="49386d341f6cc7754611b6a6c194cd8140385a7b91dc049f19bb791c005d6a86" Jan 30 08:27:16 crc kubenswrapper[4870]: I0130 08:27:16.985291 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.000400 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"28b3f6582d57c735880cfcf204b8d97b68c5ebf4315506c7cc59ffa463f6d05b"} Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.000673 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"ee6cfe59aa2bf3c554bfc7c37484cbebc1397bdc28a2e25c180f98d5d0aa78d6"} Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.000687 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"2c5634890c50294f161a731d472116f0b41f085293d4cc7bc69c2394ef3a2ae1"} Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.002470 4870 generic.go:334] "Generic (PLEG): container finished" podID="0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" containerID="d52e878ce9dd90e8dba444ebd6a2071ac79735b92b1f1220889d88eefcb18bc4" exitCode=0 Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.002508 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz-config-9v2n7" event={"ID":"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61","Type":"ContainerDied","Data":"d52e878ce9dd90e8dba444ebd6a2071ac79735b92b1f1220889d88eefcb18bc4"} Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.020048 4870 scope.go:117] "RemoveContainer" containerID="0e6b5e7cf5bb76c5d4597f6eef6bb065c51cbad9cc2aae78711fd5e59b7109c8" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.046512 4870 scope.go:117] "RemoveContainer" containerID="4c9a2cc96afb4697dcd9efa47cf237f34c2e2a0fb97e86a04bed4e71098a047b" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056694 4870 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-9z5p8\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-kube-api-access-9z5p8\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056724 4870 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-web-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056735 4870 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056745 4870 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056774 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") on node \"crc\" " Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056786 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056795 4870 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-config-out\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056803 4870 reconciler_common.go:293] 
"Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056816 4870 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.056826 4870 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.063772 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.071135 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.086209 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cpgc6"] Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.091533 4870 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.091661 4870 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1") on node "crc" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.098041 4870 scope.go:117] "RemoveContainer" containerID="431cfd92f94e1fd13cdf200e4b8c59047ac3e311acf24702741a42c672002d0e" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.106586 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:27:17 crc kubenswrapper[4870]: E0130 08:27:17.106971 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="config-reloader" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.106990 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="config-reloader" Jan 30 08:27:17 crc kubenswrapper[4870]: E0130 08:27:17.107014 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="prometheus" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.107022 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="prometheus" Jan 30 08:27:17 crc kubenswrapper[4870]: E0130 08:27:17.107036 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="thanos-sidecar" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.107042 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="thanos-sidecar" Jan 30 08:27:17 crc kubenswrapper[4870]: E0130 08:27:17.107054 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" 
containerName="init-config-reloader" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.107060 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="init-config-reloader" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.107208 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="thanos-sidecar" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.107220 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="config-reloader" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.107234 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" containerName="prometheus" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.108666 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.112975 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.113286 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.116560 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.116821 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.116827 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.117067 4870 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.122906 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-88lql" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.123652 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.127616 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.139403 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.160632 4870 reconciler_common.go:293] "Volume detached for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262317 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262382 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262403 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262421 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262451 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5j9q\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-kube-api-access-f5j9q\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262472 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262506 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262525 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262563 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262596 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262630 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " 
pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262661 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.262697 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364066 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364119 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364140 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364159 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364180 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5j9q\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-kube-api-access-f5j9q\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364200 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364222 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364239 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364260 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364283 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364316 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364346 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.364373 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.365524 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.366075 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.366100 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.367353 4870 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.367381 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b608408b27cf3925c08af2a9b3a133a2b5eb87db3a290a5641371b0533b7f7d2/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.369489 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.369683 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.369792 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.369819 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.373544 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.374598 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.378063 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.382869 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5j9q\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-kube-api-access-f5j9q\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.383381 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.406557 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.433317 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:17 crc kubenswrapper[4870]: I0130 08:27:17.824562 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:27:17 crc kubenswrapper[4870]: W0130 08:27:17.827559 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8b2056_4db2_489e_b1d1_b201e38e84c8.slice/crio-290c2383e9eae83628bb57bb648be794756981366804a5738c5a44985dd7ad40 WatchSource:0}: Error finding container 290c2383e9eae83628bb57bb648be794756981366804a5738c5a44985dd7ad40: Status 404 returned error can't find the container with id 290c2383e9eae83628bb57bb648be794756981366804a5738c5a44985dd7ad40 Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.021855 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerStarted","Data":"290c2383e9eae83628bb57bb648be794756981366804a5738c5a44985dd7ad40"} Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.023534 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-cpgc6" event={"ID":"8bc3ddf0-5fc8-4425-a434-1452753e1297","Type":"ContainerStarted","Data":"1e43a638833e8a28b17503377129992ef8df2c8dae8700c2567db5f0ab6b74f9"} Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.023555 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cpgc6" event={"ID":"8bc3ddf0-5fc8-4425-a434-1452753e1297","Type":"ContainerStarted","Data":"78139496b9b71f0c64108fce20cdfec939241b19eab6e4d8770978ff18162ccc"} Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.032552 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"16d47837901223c83d13a881667ca19a8d3f0bba47dd7740ca29b2b563faec06"} Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.032602 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"5c323511750406d5c784857d3013f82787d96dd94be71912f4d4b34888391b38"} Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.043861 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-cpgc6" podStartSLOduration=2.043844274 podStartE2EDuration="2.043844274s" podCreationTimestamp="2026-01-30 08:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:18.043096391 +0000 UTC m=+1076.738643510" watchObservedRunningTime="2026-01-30 08:27:18.043844274 +0000 UTC m=+1076.739391383" Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.085279 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0410b897-4bd8-48aa-a9fd-8213f6d9dbd7" path="/var/lib/kubelet/pods/0410b897-4bd8-48aa-a9fd-8213f6d9dbd7/volumes" Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 
08:27:18.086238 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bfb16d-8b6c-46e2-a7e3-0a5051aa66df" path="/var/lib/kubelet/pods/85bfb16d-8b6c-46e2-a7e3-0a5051aa66df/volumes"
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.464910 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-9v2n7"
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.584980 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-scripts\") pod \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") "
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585082 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-additional-scripts\") pod \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") "
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585111 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4c72\" (UniqueName: \"kubernetes.io/projected/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-kube-api-access-v4c72\") pod \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") "
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585157 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run\") pod \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") "
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585177 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run-ovn\") pod \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") "
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585191 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-log-ovn\") pod \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\" (UID: \"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61\") "
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585503 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" (UID: "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585522 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run" (OuterVolumeSpecName: "var-run") pod "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" (UID: "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585536 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" (UID: "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.585950 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" (UID: "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.587174 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-scripts" (OuterVolumeSpecName: "scripts") pod "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" (UID: "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.593081 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-kube-api-access-v4c72" (OuterVolumeSpecName: "kube-api-access-v4c72") pod "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" (UID: "0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61"). InnerVolumeSpecName "kube-api-access-v4c72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.686822 4870 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-additional-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.686852 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4c72\" (UniqueName: \"kubernetes.io/projected/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-kube-api-access-v4c72\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.686865 4870 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.686886 4870 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.686896 4870 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-var-log-ovn\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:18 crc kubenswrapper[4870]: I0130 08:27:18.686906 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.043294 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rwchz-config-9v2n7" event={"ID":"0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61","Type":"ContainerDied","Data":"158eec22a72dfc9360224d25636792800780cc5b6659cf69de830857fbfed11d"}
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.043342 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="158eec22a72dfc9360224d25636792800780cc5b6659cf69de830857fbfed11d"
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.043313 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rwchz-config-9v2n7"
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.044338 4870 generic.go:334] "Generic (PLEG): container finished" podID="8bc3ddf0-5fc8-4425-a434-1452753e1297" containerID="1e43a638833e8a28b17503377129992ef8df2c8dae8700c2567db5f0ab6b74f9" exitCode=0
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.044383 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cpgc6" event={"ID":"8bc3ddf0-5fc8-4425-a434-1452753e1297","Type":"ContainerDied","Data":"1e43a638833e8a28b17503377129992ef8df2c8dae8700c2567db5f0ab6b74f9"}
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.046561 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"9e3e6fdad343180c47bac108c7b2e0f3a756149d88bf4074b84e4a94ce06e882"}
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.046594 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"e2f2e57265bf8c97c3bd6727882b68f4dff4210478291f0099551840b35dab45"}
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.570354 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rwchz-config-9v2n7"]
Jan 30 08:27:19 crc kubenswrapper[4870]: I0130 08:27:19.613173 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rwchz-config-9v2n7"]
Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.058540 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"3fd30c252a3edaca8bae848a280736aa4cf37abe10eda7b9788b10dc738b3491"}
Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.058585 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"0dbfe1e27c44cb16dd97a3681009f1b6376ea16bdf4fcb49366b4993ce26db3b"}
Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.058595 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"0cf8afbd61720cfc11d46949eb9f90c46dd3cf0e6fb98475f3adbccb26a3d908"}
Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.058604 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"a5b93a8945cb3e931297968ea316ce014efc23a929aea9a0c6eef33d025b0f77"}
Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.085648 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" path="/var/lib/kubelet/pods/0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61/volumes"
Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.393501 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.537711 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc3ddf0-5fc8-4425-a434-1452753e1297-operator-scripts\") pod \"8bc3ddf0-5fc8-4425-a434-1452753e1297\" (UID: \"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.537745 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9t6n\" (UniqueName: \"kubernetes.io/projected/8bc3ddf0-5fc8-4425-a434-1452753e1297-kube-api-access-f9t6n\") pod \"8bc3ddf0-5fc8-4425-a434-1452753e1297\" (UID: \"8bc3ddf0-5fc8-4425-a434-1452753e1297\") " Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.538568 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc3ddf0-5fc8-4425-a434-1452753e1297-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bc3ddf0-5fc8-4425-a434-1452753e1297" (UID: "8bc3ddf0-5fc8-4425-a434-1452753e1297"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.552023 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc3ddf0-5fc8-4425-a434-1452753e1297-kube-api-access-f9t6n" (OuterVolumeSpecName: "kube-api-access-f9t6n") pod "8bc3ddf0-5fc8-4425-a434-1452753e1297" (UID: "8bc3ddf0-5fc8-4425-a434-1452753e1297"). InnerVolumeSpecName "kube-api-access-f9t6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.639198 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc3ddf0-5fc8-4425-a434-1452753e1297-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:20 crc kubenswrapper[4870]: I0130 08:27:20.639236 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9t6n\" (UniqueName: \"kubernetes.io/projected/8bc3ddf0-5fc8-4425-a434-1452753e1297-kube-api-access-f9t6n\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.066783 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerStarted","Data":"93e4e1345741b60dca904480e4327da8f596dec2a8d2178c87fe6d5632a2daeb"} Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.068924 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cpgc6" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.068867 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cpgc6" event={"ID":"8bc3ddf0-5fc8-4425-a434-1452753e1297","Type":"ContainerDied","Data":"78139496b9b71f0c64108fce20cdfec939241b19eab6e4d8770978ff18162ccc"} Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.069063 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78139496b9b71f0c64108fce20cdfec939241b19eab6e4d8770978ff18162ccc" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.075004 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"3c1d902d9b1767676c892638e464831c2bd80397f512d2b8f4c5f9e2d5490e79"} Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.075048 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"32972c3d0b0dd45d3e677908a123951df694031ac3230dbeddd921b379482ec6"} Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.075077 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"46634e41-7d5b-4181-b824-716bb37fca47","Type":"ContainerStarted","Data":"ddf3bd3df8ecee80792fc5a8f5a73068eecb83f791313d0a09626487f5d05403"} Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.159609 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.366932356 podStartE2EDuration="40.159584057s" podCreationTimestamp="2026-01-30 08:26:41 +0000 UTC" firstStartedPulling="2026-01-30 08:27:15.166302482 +0000 UTC m=+1073.861849591" lastFinishedPulling="2026-01-30 08:27:18.958954183 +0000 UTC m=+1077.654501292" observedRunningTime="2026-01-30 
08:27:21.151459357 +0000 UTC m=+1079.847006476" watchObservedRunningTime="2026-01-30 08:27:21.159584057 +0000 UTC m=+1079.855131166" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.426020 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757cc9679f-wq2nt"] Jan 30 08:27:21 crc kubenswrapper[4870]: E0130 08:27:21.426382 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" containerName="ovn-config" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.426402 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" containerName="ovn-config" Jan 30 08:27:21 crc kubenswrapper[4870]: E0130 08:27:21.426423 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc3ddf0-5fc8-4425-a434-1452753e1297" containerName="mariadb-account-create-update" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.426432 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc3ddf0-5fc8-4425-a434-1452753e1297" containerName="mariadb-account-create-update" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.426624 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc3ddf0-5fc8-4425-a434-1452753e1297" containerName="mariadb-account-create-update" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.426655 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c02c670-5f7d-4ee6-9072-e6e1ba2d6c61" containerName="ovn-config" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.427567 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.429237 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.442789 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757cc9679f-wq2nt"] Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.554466 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpz4\" (UniqueName: \"kubernetes.io/projected/c409417e-6b71-491c-b7c5-bf1a2b63baed-kube-api-access-5mpz4\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.554567 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-swift-storage-0\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.554585 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-sb\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.554604 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-config\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " 
pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.554634 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-nb\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.554652 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-svc\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.655916 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpz4\" (UniqueName: \"kubernetes.io/projected/c409417e-6b71-491c-b7c5-bf1a2b63baed-kube-api-access-5mpz4\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.656377 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-swift-storage-0\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.656404 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-sb\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " 
pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.657204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-swift-storage-0\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.657271 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-config\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.657826 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-config\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.657307 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-nb\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.657899 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-svc\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 
08:27:21.657999 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-sb\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.658530 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-svc\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.658779 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-nb\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.675332 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpz4\" (UniqueName: \"kubernetes.io/projected/c409417e-6b71-491c-b7c5-bf1a2b63baed-kube-api-access-5mpz4\") pod \"dnsmasq-dns-757cc9679f-wq2nt\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:21 crc kubenswrapper[4870]: I0130 08:27:21.743948 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:22 crc kubenswrapper[4870]: I0130 08:27:22.199244 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757cc9679f-wq2nt"] Jan 30 08:27:23 crc kubenswrapper[4870]: I0130 08:27:23.115712 4870 generic.go:334] "Generic (PLEG): container finished" podID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerID="0bc8a7090e4dbce528560fe9634da361240a61440f10f7e3c579fda9915e352a" exitCode=0 Jan 30 08:27:23 crc kubenswrapper[4870]: I0130 08:27:23.115996 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" event={"ID":"c409417e-6b71-491c-b7c5-bf1a2b63baed","Type":"ContainerDied","Data":"0bc8a7090e4dbce528560fe9634da361240a61440f10f7e3c579fda9915e352a"} Jan 30 08:27:23 crc kubenswrapper[4870]: I0130 08:27:23.116020 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" event={"ID":"c409417e-6b71-491c-b7c5-bf1a2b63baed","Type":"ContainerStarted","Data":"27e7324b5df73e3d0d4ead3bf3867e6cd08ef4e1f33f8795d331a5b682f586af"} Jan 30 08:27:24 crc kubenswrapper[4870]: I0130 08:27:24.125558 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" event={"ID":"c409417e-6b71-491c-b7c5-bf1a2b63baed","Type":"ContainerStarted","Data":"72ef45c7c24f8c5fc6788a5862241b333000ac3fd13dcb6350ad2553d12d13f8"} Jan 30 08:27:24 crc kubenswrapper[4870]: I0130 08:27:24.126024 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:24 crc kubenswrapper[4870]: I0130 08:27:24.151249 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" podStartSLOduration=3.151230128 podStartE2EDuration="3.151230128s" podCreationTimestamp="2026-01-30 08:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:24.142590502 +0000 UTC m=+1082.838137631" watchObservedRunningTime="2026-01-30 08:27:24.151230128 +0000 UTC m=+1082.846777257" Jan 30 08:27:25 crc kubenswrapper[4870]: I0130 08:27:25.514673 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Jan 30 08:27:25 crc kubenswrapper[4870]: I0130 08:27:25.870949 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Jan 30 08:27:26 crc kubenswrapper[4870]: I0130 08:27:26.230764 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="2ab884a9-b47a-476a-8f89-140093b96527" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Jan 30 08:27:27 crc kubenswrapper[4870]: I0130 08:27:27.160441 4870 generic.go:334] "Generic (PLEG): container finished" podID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerID="93e4e1345741b60dca904480e4327da8f596dec2a8d2178c87fe6d5632a2daeb" exitCode=0 Jan 30 08:27:27 crc kubenswrapper[4870]: I0130 08:27:27.160499 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerDied","Data":"93e4e1345741b60dca904480e4327da8f596dec2a8d2178c87fe6d5632a2daeb"} Jan 30 08:27:28 crc kubenswrapper[4870]: I0130 08:27:28.170905 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerStarted","Data":"ea102a1406731d57700d5196e250072c2053fa2345212a3d6975e629610cb94c"} Jan 30 08:27:31 crc kubenswrapper[4870]: I0130 08:27:31.746715 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:27:31 crc kubenswrapper[4870]: I0130 08:27:31.936231 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cc6fcf45-r6b9m"] Jan 30 08:27:31 crc kubenswrapper[4870]: I0130 08:27:31.936500 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" podUID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerName="dnsmasq-dns" containerID="cri-o://137ef8da742a762887455130866543407aab4e626fc693e72bbf0ba327725c4f" gracePeriod=10 Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.214776 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerStarted","Data":"d46fd5e887baec843bdd4f9f0254772bf4dc50323e052cd052dd2ea4657b7397"} Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.214954 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerStarted","Data":"a9347a512a592b79cb85be5a5a664bfadec21fed65bd7eacf1a97eb008166eb1"} Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.228354 4870 generic.go:334] "Generic (PLEG): container finished" podID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerID="137ef8da742a762887455130866543407aab4e626fc693e72bbf0ba327725c4f" exitCode=0 Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.228389 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" 
event={"ID":"8cd19c31-4252-4de7-a673-9da7aedcb785","Type":"ContainerDied","Data":"137ef8da742a762887455130866543407aab4e626fc693e72bbf0ba327725c4f"} Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.242159 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.24214519 podStartE2EDuration="15.24214519s" podCreationTimestamp="2026-01-30 08:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:32.240653295 +0000 UTC m=+1090.936200404" watchObservedRunningTime="2026-01-30 08:27:32.24214519 +0000 UTC m=+1090.937692299" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.373752 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.434214 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.434271 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.454329 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.468977 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-nb\") pod \"8cd19c31-4252-4de7-a673-9da7aedcb785\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.469029 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-dns-svc\") pod \"8cd19c31-4252-4de7-a673-9da7aedcb785\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.469071 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-sb\") pod \"8cd19c31-4252-4de7-a673-9da7aedcb785\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.469206 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx8qz\" (UniqueName: \"kubernetes.io/projected/8cd19c31-4252-4de7-a673-9da7aedcb785-kube-api-access-cx8qz\") pod \"8cd19c31-4252-4de7-a673-9da7aedcb785\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.469271 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-config\") pod \"8cd19c31-4252-4de7-a673-9da7aedcb785\" (UID: \"8cd19c31-4252-4de7-a673-9da7aedcb785\") " Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.492600 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cd19c31-4252-4de7-a673-9da7aedcb785-kube-api-access-cx8qz" (OuterVolumeSpecName: "kube-api-access-cx8qz") pod "8cd19c31-4252-4de7-a673-9da7aedcb785" (UID: "8cd19c31-4252-4de7-a673-9da7aedcb785"). InnerVolumeSpecName "kube-api-access-cx8qz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.524677 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8cd19c31-4252-4de7-a673-9da7aedcb785" (UID: "8cd19c31-4252-4de7-a673-9da7aedcb785"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.535343 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8cd19c31-4252-4de7-a673-9da7aedcb785" (UID: "8cd19c31-4252-4de7-a673-9da7aedcb785"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.537855 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-config" (OuterVolumeSpecName: "config") pod "8cd19c31-4252-4de7-a673-9da7aedcb785" (UID: "8cd19c31-4252-4de7-a673-9da7aedcb785"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.547151 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8cd19c31-4252-4de7-a673-9da7aedcb785" (UID: "8cd19c31-4252-4de7-a673-9da7aedcb785"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.570830 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.570861 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.570884 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.570894 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx8qz\" (UniqueName: \"kubernetes.io/projected/8cd19c31-4252-4de7-a673-9da7aedcb785-kube-api-access-cx8qz\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:32 crc kubenswrapper[4870]: I0130 08:27:32.570905 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd19c31-4252-4de7-a673-9da7aedcb785-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:33 crc kubenswrapper[4870]: I0130 08:27:33.243239 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" event={"ID":"8cd19c31-4252-4de7-a673-9da7aedcb785","Type":"ContainerDied","Data":"374b146ad8265eb6041ff5f1143dd86432961e72a020212acd84189e8d8f2978"} Jan 30 08:27:33 crc kubenswrapper[4870]: I0130 08:27:33.243742 4870 scope.go:117] "RemoveContainer" containerID="137ef8da742a762887455130866543407aab4e626fc693e72bbf0ba327725c4f" Jan 30 08:27:33 crc kubenswrapper[4870]: I0130 08:27:33.243283 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cc6fcf45-r6b9m" Jan 30 08:27:33 crc kubenswrapper[4870]: I0130 08:27:33.252880 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 08:27:33 crc kubenswrapper[4870]: I0130 08:27:33.279493 4870 scope.go:117] "RemoveContainer" containerID="fa0e8e29630ac45ae5392bdda60293a38298eb7a8fb05baa4e216154fe19f932" Jan 30 08:27:33 crc kubenswrapper[4870]: I0130 08:27:33.354205 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cc6fcf45-r6b9m"] Jan 30 08:27:33 crc kubenswrapper[4870]: I0130 08:27:33.365682 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65cc6fcf45-r6b9m"] Jan 30 08:27:34 crc kubenswrapper[4870]: I0130 08:27:34.091103 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cd19c31-4252-4de7-a673-9da7aedcb785" path="/var/lib/kubelet/pods/8cd19c31-4252-4de7-a673-9da7aedcb785/volumes" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.515287 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.870053 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.885107 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kqrrr"] Jan 30 08:27:35 crc kubenswrapper[4870]: E0130 08:27:35.885418 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerName="dnsmasq-dns" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.885435 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerName="dnsmasq-dns" Jan 30 08:27:35 crc kubenswrapper[4870]: E0130 08:27:35.885449 4870 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerName="init" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.885456 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerName="init" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.885599 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd19c31-4252-4de7-a673-9da7aedcb785" containerName="dnsmasq-dns" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.886146 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.897829 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kqrrr"] Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.946602 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59f46507-531f-4d06-86d9-6c07a50abc6d-operator-scripts\") pod \"cinder-db-create-kqrrr\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") " pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.946662 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpk7\" (UniqueName: \"kubernetes.io/projected/59f46507-531f-4d06-86d9-6c07a50abc6d-kube-api-access-tkpk7\") pod \"cinder-db-create-kqrrr\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") " pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.978546 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6lzp5"] Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.982665 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:35 crc kubenswrapper[4870]: I0130 08:27:35.988814 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6lzp5"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.047944 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d425622-da05-4988-a059-013c06b4ecf1-operator-scripts\") pod \"barbican-db-create-6lzp5\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") " pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.047991 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59f46507-531f-4d06-86d9-6c07a50abc6d-operator-scripts\") pod \"cinder-db-create-kqrrr\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") " pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.048028 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpk7\" (UniqueName: \"kubernetes.io/projected/59f46507-531f-4d06-86d9-6c07a50abc6d-kube-api-access-tkpk7\") pod \"cinder-db-create-kqrrr\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") " pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.048061 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlw25\" (UniqueName: \"kubernetes.io/projected/4d425622-da05-4988-a059-013c06b4ecf1-kube-api-access-vlw25\") pod \"barbican-db-create-6lzp5\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") " pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.048671 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/59f46507-531f-4d06-86d9-6c07a50abc6d-operator-scripts\") pod \"cinder-db-create-kqrrr\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") " pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.072533 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpk7\" (UniqueName: \"kubernetes.io/projected/59f46507-531f-4d06-86d9-6c07a50abc6d-kube-api-access-tkpk7\") pod \"cinder-db-create-kqrrr\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") " pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.119791 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-937e-account-create-update-6w49r"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.120984 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.131404 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-937e-account-create-update-6w49r"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.131542 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.148993 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htvx9\" (UniqueName: \"kubernetes.io/projected/17e1f740-4393-4ba2-8242-fb863196cb02-kube-api-access-htvx9\") pod \"cinder-937e-account-create-update-6w49r\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") " pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.149064 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlw25\" (UniqueName: 
\"kubernetes.io/projected/4d425622-da05-4988-a059-013c06b4ecf1-kube-api-access-vlw25\") pod \"barbican-db-create-6lzp5\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") " pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.149161 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e1f740-4393-4ba2-8242-fb863196cb02-operator-scripts\") pod \"cinder-937e-account-create-update-6w49r\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") " pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.149231 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d425622-da05-4988-a059-013c06b4ecf1-operator-scripts\") pod \"barbican-db-create-6lzp5\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") " pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.149909 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d425622-da05-4988-a059-013c06b4ecf1-operator-scripts\") pod \"barbican-db-create-6lzp5\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") " pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.181513 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlw25\" (UniqueName: \"kubernetes.io/projected/4d425622-da05-4988-a059-013c06b4ecf1-kube-api-access-vlw25\") pod \"barbican-db-create-6lzp5\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") " pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.181577 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-f6r68"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 
08:27:36.182623 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.186849 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.187132 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vn7b5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.187253 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.187519 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.191194 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f6r68"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.197499 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0515-account-create-update-rln5d"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.198524 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.200452 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.200914 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kqrrr" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.207333 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0515-account-create-update-rln5d"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.231108 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.250415 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19155d05-01da-4e21-96c2-f23662f8f785-operator-scripts\") pod \"barbican-0515-account-create-update-rln5d\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") " pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.250480 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e1f740-4393-4ba2-8242-fb863196cb02-operator-scripts\") pod \"cinder-937e-account-create-update-6w49r\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") " pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.250501 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-config-data\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.250579 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d78lm\" (UniqueName: \"kubernetes.io/projected/19155d05-01da-4e21-96c2-f23662f8f785-kube-api-access-d78lm\") pod 
\"barbican-0515-account-create-update-rln5d\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") " pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.250624 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wknsc\" (UniqueName: \"kubernetes.io/projected/881527d5-776b-4639-9306-895d1e370abd-kube-api-access-wknsc\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.250644 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htvx9\" (UniqueName: \"kubernetes.io/projected/17e1f740-4393-4ba2-8242-fb863196cb02-kube-api-access-htvx9\") pod \"cinder-937e-account-create-update-6w49r\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") " pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.250691 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-combined-ca-bundle\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.251968 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e1f740-4393-4ba2-8242-fb863196cb02-operator-scripts\") pod \"cinder-937e-account-create-update-6w49r\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") " pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.275347 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htvx9\" (UniqueName: 
\"kubernetes.io/projected/17e1f740-4393-4ba2-8242-fb863196cb02-kube-api-access-htvx9\") pod \"cinder-937e-account-create-update-6w49r\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") " pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.296507 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6lzp5" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.351958 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d78lm\" (UniqueName: \"kubernetes.io/projected/19155d05-01da-4e21-96c2-f23662f8f785-kube-api-access-d78lm\") pod \"barbican-0515-account-create-update-rln5d\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") " pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.352051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wknsc\" (UniqueName: \"kubernetes.io/projected/881527d5-776b-4639-9306-895d1e370abd-kube-api-access-wknsc\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.352105 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-combined-ca-bundle\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.352186 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19155d05-01da-4e21-96c2-f23662f8f785-operator-scripts\") pod \"barbican-0515-account-create-update-rln5d\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") " 
pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.352222 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-config-data\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.353066 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19155d05-01da-4e21-96c2-f23662f8f785-operator-scripts\") pod \"barbican-0515-account-create-update-rln5d\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") " pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.359064 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-config-data\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.363769 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-combined-ca-bundle\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.377822 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wknsc\" (UniqueName: \"kubernetes.io/projected/881527d5-776b-4639-9306-895d1e370abd-kube-api-access-wknsc\") pod \"keystone-db-sync-f6r68\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") " pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: 
I0130 08:27:36.393509 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d78lm\" (UniqueName: \"kubernetes.io/projected/19155d05-01da-4e21-96c2-f23662f8f785-kube-api-access-d78lm\") pod \"barbican-0515-account-create-update-rln5d\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") " pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.444822 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.544273 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f6r68" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.617237 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.792817 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kqrrr"] Jan 30 08:27:36 crc kubenswrapper[4870]: I0130 08:27:36.857503 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6lzp5"] Jan 30 08:27:36 crc kubenswrapper[4870]: W0130 08:27:36.873362 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d425622_da05_4988_a059_013c06b4ecf1.slice/crio-443c9665d6a75766dd0f1a996e3ba08bc212823b23adcfb2048b16772860a938 WatchSource:0}: Error finding container 443c9665d6a75766dd0f1a996e3ba08bc212823b23adcfb2048b16772860a938: Status 404 returned error can't find the container with id 443c9665d6a75766dd0f1a996e3ba08bc212823b23adcfb2048b16772860a938 Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.044715 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0515-account-create-update-rln5d"] Jan 30 
08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.072269 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-937e-account-create-update-6w49r"] Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.172360 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f6r68"] Jan 30 08:27:38 crc kubenswrapper[4870]: W0130 08:27:37.186541 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod881527d5_776b_4639_9306_895d1e370abd.slice/crio-a31484a12f791686023b40843494f6f2ece996f4e78a4c51e74f6a74ad7d512e WatchSource:0}: Error finding container a31484a12f791686023b40843494f6f2ece996f4e78a4c51e74f6a74ad7d512e: Status 404 returned error can't find the container with id a31484a12f791686023b40843494f6f2ece996f4e78a4c51e74f6a74ad7d512e Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.298390 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f6r68" event={"ID":"881527d5-776b-4639-9306-895d1e370abd","Type":"ContainerStarted","Data":"a31484a12f791686023b40843494f6f2ece996f4e78a4c51e74f6a74ad7d512e"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.307849 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-937e-account-create-update-6w49r" event={"ID":"17e1f740-4393-4ba2-8242-fb863196cb02","Type":"ContainerStarted","Data":"b9684209c89c1359375b31da44bbfc4622187e78936d27a08d572996df752ae7"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.311747 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6lzp5" event={"ID":"4d425622-da05-4988-a059-013c06b4ecf1","Type":"ContainerStarted","Data":"ce2685881a857cd53a444b13c2f7aef4bd6f5c6b26f0a8cbc8a0c60a7f826c60"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.311785 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6lzp5" 
event={"ID":"4d425622-da05-4988-a059-013c06b4ecf1","Type":"ContainerStarted","Data":"443c9665d6a75766dd0f1a996e3ba08bc212823b23adcfb2048b16772860a938"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.314531 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0515-account-create-update-rln5d" event={"ID":"19155d05-01da-4e21-96c2-f23662f8f785","Type":"ContainerStarted","Data":"3133fff318af851e54e7933fe20396b64a992bab1d22e8aca788bbc77160af37"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.317073 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kqrrr" event={"ID":"59f46507-531f-4d06-86d9-6c07a50abc6d","Type":"ContainerStarted","Data":"c19fa8ba72448fbf848d632a4b2c87c38ba00d3573897003f02d36a9263593ff"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.317106 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kqrrr" event={"ID":"59f46507-531f-4d06-86d9-6c07a50abc6d","Type":"ContainerStarted","Data":"f141a3af47237c42daeb2dac21dd12e35e4144a583e7317c4500c4b27418e250"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.335137 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-6lzp5" podStartSLOduration=2.335118903 podStartE2EDuration="2.335118903s" podCreationTimestamp="2026-01-30 08:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:37.326219861 +0000 UTC m=+1096.021766970" watchObservedRunningTime="2026-01-30 08:27:37.335118903 +0000 UTC m=+1096.030666012" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:37.352483 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-kqrrr" podStartSLOduration=2.352465547 podStartE2EDuration="2.352465547s" podCreationTimestamp="2026-01-30 08:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:37.348463069 +0000 UTC m=+1096.044010188" watchObservedRunningTime="2026-01-30 08:27:37.352465547 +0000 UTC m=+1096.048012656" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.326679 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-937e-account-create-update-6w49r" event={"ID":"17e1f740-4393-4ba2-8242-fb863196cb02","Type":"ContainerStarted","Data":"ebb6defef32112bcd4f761a254fe06dd72ca1e2b11d0f09023e3983d12f747be"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.328626 4870 generic.go:334] "Generic (PLEG): container finished" podID="4d425622-da05-4988-a059-013c06b4ecf1" containerID="ce2685881a857cd53a444b13c2f7aef4bd6f5c6b26f0a8cbc8a0c60a7f826c60" exitCode=0 Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.328703 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6lzp5" event={"ID":"4d425622-da05-4988-a059-013c06b4ecf1","Type":"ContainerDied","Data":"ce2685881a857cd53a444b13c2f7aef4bd6f5c6b26f0a8cbc8a0c60a7f826c60"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.330677 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0515-account-create-update-rln5d" event={"ID":"19155d05-01da-4e21-96c2-f23662f8f785","Type":"ContainerStarted","Data":"6dcb2a606401562e049d19a34d68af34e28fc99c34413a4f7cfddf60bc5211ee"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.334378 4870 generic.go:334] "Generic (PLEG): container finished" podID="59f46507-531f-4d06-86d9-6c07a50abc6d" containerID="c19fa8ba72448fbf848d632a4b2c87c38ba00d3573897003f02d36a9263593ff" exitCode=0 Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.334418 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kqrrr" 
event={"ID":"59f46507-531f-4d06-86d9-6c07a50abc6d","Type":"ContainerDied","Data":"c19fa8ba72448fbf848d632a4b2c87c38ba00d3573897003f02d36a9263593ff"} Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.357435 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-937e-account-create-update-6w49r" podStartSLOduration=2.357420316 podStartE2EDuration="2.357420316s" podCreationTimestamp="2026-01-30 08:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:38.342205436 +0000 UTC m=+1097.037752545" watchObservedRunningTime="2026-01-30 08:27:38.357420316 +0000 UTC m=+1097.052967425" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.395013 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0515-account-create-update-rln5d" podStartSLOduration=2.3949898689999998 podStartE2EDuration="2.394989869s" podCreationTimestamp="2026-01-30 08:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:38.390809754 +0000 UTC m=+1097.086356863" watchObservedRunningTime="2026-01-30 08:27:38.394989869 +0000 UTC m=+1097.090536978" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.476221 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8td6r"] Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.477266 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8td6r" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.486791 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8td6r"] Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.541535 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-gbfzh"] Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.542649 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-gbfzh" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.547506 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.547824 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-b9kpk" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.557355 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-gbfzh"] Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.603549 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfc35112-b552-434a-b702-26c53cbf5574-operator-scripts\") pod \"glance-db-create-8td6r\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") " pod="openstack/glance-db-create-8td6r" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.603597 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f4zh\" (UniqueName: \"kubernetes.io/projected/dfc35112-b552-434a-b702-26c53cbf5574-kube-api-access-8f4zh\") pod \"glance-db-create-8td6r\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") " pod="openstack/glance-db-create-8td6r" Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.612053 4870 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-9d1f-account-create-update-mffzg"]
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.613379 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.617465 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.638864 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9d1f-account-create-update-mffzg"]
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.689000 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xrsjh"]
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.690102 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.695376 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xrsjh"]
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742233 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b42t4\" (UniqueName: \"kubernetes.io/projected/051874aa-a01e-40bf-a987-a830886ea878-kube-api-access-b42t4\") pod \"neutron-db-create-xrsjh\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") " pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742386 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfc35112-b552-434a-b702-26c53cbf5574-operator-scripts\") pod \"glance-db-create-8td6r\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") " pod="openstack/glance-db-create-8td6r"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742413 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-combined-ca-bundle\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742447 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051874aa-a01e-40bf-a987-a830886ea878-operator-scripts\") pod \"neutron-db-create-xrsjh\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") " pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742501 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f4zh\" (UniqueName: \"kubernetes.io/projected/dfc35112-b552-434a-b702-26c53cbf5574-kube-api-access-8f4zh\") pod \"glance-db-create-8td6r\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") " pod="openstack/glance-db-create-8td6r"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742544 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-config-data\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742594 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-db-sync-config-data\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742645 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb61b735-bf9c-4bf5-a5cf-1948435af72e-operator-scripts\") pod \"glance-9d1f-account-create-update-mffzg\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") " pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742702 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbn2\" (UniqueName: \"kubernetes.io/projected/e8637667-8b7e-455e-8ba9-b6291574e4ce-kube-api-access-xfbn2\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.742776 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5894m\" (UniqueName: \"kubernetes.io/projected/eb61b735-bf9c-4bf5-a5cf-1948435af72e-kube-api-access-5894m\") pod \"glance-9d1f-account-create-update-mffzg\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") " pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.743817 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfc35112-b552-434a-b702-26c53cbf5574-operator-scripts\") pod \"glance-db-create-8td6r\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") " pod="openstack/glance-db-create-8td6r"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.778625 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6de9-account-create-update-nwcgl"]
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.780112 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.796414 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.796458 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f4zh\" (UniqueName: \"kubernetes.io/projected/dfc35112-b552-434a-b702-26c53cbf5574-kube-api-access-8f4zh\") pod \"glance-db-create-8td6r\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") " pod="openstack/glance-db-create-8td6r"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.809237 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6de9-account-create-update-nwcgl"]
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.846372 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb61b735-bf9c-4bf5-a5cf-1948435af72e-operator-scripts\") pod \"glance-9d1f-account-create-update-mffzg\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") " pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.846648 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbn2\" (UniqueName: \"kubernetes.io/projected/e8637667-8b7e-455e-8ba9-b6291574e4ce-kube-api-access-xfbn2\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.846820 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5894m\" (UniqueName: \"kubernetes.io/projected/eb61b735-bf9c-4bf5-a5cf-1948435af72e-kube-api-access-5894m\") pod \"glance-9d1f-account-create-update-mffzg\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") " pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.847896 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b42t4\" (UniqueName: \"kubernetes.io/projected/051874aa-a01e-40bf-a987-a830886ea878-kube-api-access-b42t4\") pod \"neutron-db-create-xrsjh\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") " pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.848051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-combined-ca-bundle\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.848151 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051874aa-a01e-40bf-a987-a830886ea878-operator-scripts\") pod \"neutron-db-create-xrsjh\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") " pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.848246 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-config-data\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.848328 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-db-sync-config-data\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.848594 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb61b735-bf9c-4bf5-a5cf-1948435af72e-operator-scripts\") pod \"glance-9d1f-account-create-update-mffzg\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") " pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.849595 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051874aa-a01e-40bf-a987-a830886ea878-operator-scripts\") pod \"neutron-db-create-xrsjh\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") " pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.857839 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-config-data\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.859127 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-db-sync-config-data\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.863628 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-combined-ca-bundle\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.869324 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5894m\" (UniqueName: \"kubernetes.io/projected/eb61b735-bf9c-4bf5-a5cf-1948435af72e-kube-api-access-5894m\") pod \"glance-9d1f-account-create-update-mffzg\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") " pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.872881 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b42t4\" (UniqueName: \"kubernetes.io/projected/051874aa-a01e-40bf-a987-a830886ea878-kube-api-access-b42t4\") pod \"neutron-db-create-xrsjh\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") " pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.875103 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8td6r"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.877460 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbn2\" (UniqueName: \"kubernetes.io/projected/e8637667-8b7e-455e-8ba9-b6291574e4ce-kube-api-access-xfbn2\") pod \"watcher-db-sync-gbfzh\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.886503 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.952792 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6566e49-850d-460e-9a22-9bfd7384f494-operator-scripts\") pod \"neutron-6de9-account-create-update-nwcgl\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.953834 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmq87\" (UniqueName: \"kubernetes.io/projected/b6566e49-850d-460e-9a22-9bfd7384f494-kube-api-access-nmq87\") pod \"neutron-6de9-account-create-update-nwcgl\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:38 crc kubenswrapper[4870]: I0130 08:27:38.956862 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.055091 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6566e49-850d-460e-9a22-9bfd7384f494-operator-scripts\") pod \"neutron-6de9-account-create-update-nwcgl\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.055210 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmq87\" (UniqueName: \"kubernetes.io/projected/b6566e49-850d-460e-9a22-9bfd7384f494-kube-api-access-nmq87\") pod \"neutron-6de9-account-create-update-nwcgl\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.055820 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6566e49-850d-460e-9a22-9bfd7384f494-operator-scripts\") pod \"neutron-6de9-account-create-update-nwcgl\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.058161 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.082686 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmq87\" (UniqueName: \"kubernetes.io/projected/b6566e49-850d-460e-9a22-9bfd7384f494-kube-api-access-nmq87\") pod \"neutron-6de9-account-create-update-nwcgl\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.130585 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.213515 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8td6r"]
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.350479 4870 generic.go:334] "Generic (PLEG): container finished" podID="17e1f740-4393-4ba2-8242-fb863196cb02" containerID="ebb6defef32112bcd4f761a254fe06dd72ca1e2b11d0f09023e3983d12f747be" exitCode=0
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.350519 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-937e-account-create-update-6w49r" event={"ID":"17e1f740-4393-4ba2-8242-fb863196cb02","Type":"ContainerDied","Data":"ebb6defef32112bcd4f761a254fe06dd72ca1e2b11d0f09023e3983d12f747be"}
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.352568 4870 generic.go:334] "Generic (PLEG): container finished" podID="19155d05-01da-4e21-96c2-f23662f8f785" containerID="6dcb2a606401562e049d19a34d68af34e28fc99c34413a4f7cfddf60bc5211ee" exitCode=0
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.352633 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0515-account-create-update-rln5d" event={"ID":"19155d05-01da-4e21-96c2-f23662f8f785","Type":"ContainerDied","Data":"6dcb2a606401562e049d19a34d68af34e28fc99c34413a4f7cfddf60bc5211ee"}
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.357190 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8td6r" event={"ID":"dfc35112-b552-434a-b702-26c53cbf5574","Type":"ContainerStarted","Data":"d42477892b3ddaabcfbb181315d9bfde068e17d9e265081ee29d48545444f115"}
Jan 30 08:27:39 crc kubenswrapper[4870]: W0130 08:27:39.480991 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8637667_8b7e_455e_8ba9_b6291574e4ce.slice/crio-25fc46aae9550fd891b36bb65004fc5f285237547119046fa6127a16efed7dfd WatchSource:0}: Error finding container 25fc46aae9550fd891b36bb65004fc5f285237547119046fa6127a16efed7dfd: Status 404 returned error can't find the container with id 25fc46aae9550fd891b36bb65004fc5f285237547119046fa6127a16efed7dfd
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.487731 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-gbfzh"]
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.526622 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6de9-account-create-update-nwcgl"]
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.537557 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9d1f-account-create-update-mffzg"]
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.622269 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xrsjh"]
Jan 30 08:27:39 crc kubenswrapper[4870]: I0130 08:27:39.983225 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6lzp5"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.010928 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kqrrr"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.089107 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59f46507-531f-4d06-86d9-6c07a50abc6d-operator-scripts\") pod \"59f46507-531f-4d06-86d9-6c07a50abc6d\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") "
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.089204 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkpk7\" (UniqueName: \"kubernetes.io/projected/59f46507-531f-4d06-86d9-6c07a50abc6d-kube-api-access-tkpk7\") pod \"59f46507-531f-4d06-86d9-6c07a50abc6d\" (UID: \"59f46507-531f-4d06-86d9-6c07a50abc6d\") "
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.089292 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d425622-da05-4988-a059-013c06b4ecf1-operator-scripts\") pod \"4d425622-da05-4988-a059-013c06b4ecf1\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") "
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.089368 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlw25\" (UniqueName: \"kubernetes.io/projected/4d425622-da05-4988-a059-013c06b4ecf1-kube-api-access-vlw25\") pod \"4d425622-da05-4988-a059-013c06b4ecf1\" (UID: \"4d425622-da05-4988-a059-013c06b4ecf1\") "
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.090006 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f46507-531f-4d06-86d9-6c07a50abc6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59f46507-531f-4d06-86d9-6c07a50abc6d" (UID: "59f46507-531f-4d06-86d9-6c07a50abc6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.090709 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d425622-da05-4988-a059-013c06b4ecf1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d425622-da05-4988-a059-013c06b4ecf1" (UID: "4d425622-da05-4988-a059-013c06b4ecf1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.094705 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f46507-531f-4d06-86d9-6c07a50abc6d-kube-api-access-tkpk7" (OuterVolumeSpecName: "kube-api-access-tkpk7") pod "59f46507-531f-4d06-86d9-6c07a50abc6d" (UID: "59f46507-531f-4d06-86d9-6c07a50abc6d"). InnerVolumeSpecName "kube-api-access-tkpk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.096509 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d425622-da05-4988-a059-013c06b4ecf1-kube-api-access-vlw25" (OuterVolumeSpecName: "kube-api-access-vlw25") pod "4d425622-da05-4988-a059-013c06b4ecf1" (UID: "4d425622-da05-4988-a059-013c06b4ecf1"). InnerVolumeSpecName "kube-api-access-vlw25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.191258 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkpk7\" (UniqueName: \"kubernetes.io/projected/59f46507-531f-4d06-86d9-6c07a50abc6d-kube-api-access-tkpk7\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.191292 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d425622-da05-4988-a059-013c06b4ecf1-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.191306 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlw25\" (UniqueName: \"kubernetes.io/projected/4d425622-da05-4988-a059-013c06b4ecf1-kube-api-access-vlw25\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.191315 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59f46507-531f-4d06-86d9-6c07a50abc6d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.375023 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xrsjh" event={"ID":"051874aa-a01e-40bf-a987-a830886ea878","Type":"ContainerStarted","Data":"cb21d97629ca986697c0491abe322efbe6175a053c82fd76648eaa5b827fb2cc"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.375074 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xrsjh" event={"ID":"051874aa-a01e-40bf-a987-a830886ea878","Type":"ContainerStarted","Data":"6af75c88e6c19044721841617ae2fbc93efc4be1bd4b334456132a0a6bec8e0f"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.381179 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-gbfzh" event={"ID":"e8637667-8b7e-455e-8ba9-b6291574e4ce","Type":"ContainerStarted","Data":"25fc46aae9550fd891b36bb65004fc5f285237547119046fa6127a16efed7dfd"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.396503 4870 generic.go:334] "Generic (PLEG): container finished" podID="dfc35112-b552-434a-b702-26c53cbf5574" containerID="b1933043ebcbf2051360c783e7b0fa2a563a6c4cee962802cf9d526f5fcd348c" exitCode=0
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.396877 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8td6r" event={"ID":"dfc35112-b552-434a-b702-26c53cbf5574","Type":"ContainerDied","Data":"b1933043ebcbf2051360c783e7b0fa2a563a6c4cee962802cf9d526f5fcd348c"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.401658 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6lzp5" event={"ID":"4d425622-da05-4988-a059-013c06b4ecf1","Type":"ContainerDied","Data":"443c9665d6a75766dd0f1a996e3ba08bc212823b23adcfb2048b16772860a938"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.401685 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443c9665d6a75766dd0f1a996e3ba08bc212823b23adcfb2048b16772860a938"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.401726 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6lzp5"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.403319 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9d1f-account-create-update-mffzg" event={"ID":"eb61b735-bf9c-4bf5-a5cf-1948435af72e","Type":"ContainerStarted","Data":"a86a16c99cbecfe80af026afdf8bc6eec15eafc6658e69be7a63babe7a18aa00"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.403344 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9d1f-account-create-update-mffzg" event={"ID":"eb61b735-bf9c-4bf5-a5cf-1948435af72e","Type":"ContainerStarted","Data":"784b44f85ae007ee5455ec0b1b0eefd1ddd8dc08aafa33e4bdb7deee1e8d44f2"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.405937 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kqrrr"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.405953 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kqrrr" event={"ID":"59f46507-531f-4d06-86d9-6c07a50abc6d","Type":"ContainerDied","Data":"f141a3af47237c42daeb2dac21dd12e35e4144a583e7317c4500c4b27418e250"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.406008 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f141a3af47237c42daeb2dac21dd12e35e4144a583e7317c4500c4b27418e250"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.410776 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-xrsjh" podStartSLOduration=2.41075229 podStartE2EDuration="2.41075229s" podCreationTimestamp="2026-01-30 08:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:40.393232851 +0000 UTC m=+1099.088779960" watchObservedRunningTime="2026-01-30 08:27:40.41075229 +0000 UTC m=+1099.106299399"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.414105 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6de9-account-create-update-nwcgl" event={"ID":"b6566e49-850d-460e-9a22-9bfd7384f494","Type":"ContainerStarted","Data":"96c828944b59ded4cdb603b725476894266a7134b9d788fbea6f5b49b309942a"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.415409 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6de9-account-create-update-nwcgl" event={"ID":"b6566e49-850d-460e-9a22-9bfd7384f494","Type":"ContainerStarted","Data":"ffc742f6a81620434ff82836c4c9e4cd30220b47560730abe3ffaf861348054f"}
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.432735 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9d1f-account-create-update-mffzg" podStartSLOduration=2.43271721 podStartE2EDuration="2.43271721s" podCreationTimestamp="2026-01-30 08:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:40.420191759 +0000 UTC m=+1099.115738868" watchObservedRunningTime="2026-01-30 08:27:40.43271721 +0000 UTC m=+1099.128264319"
Jan 30 08:27:40 crc kubenswrapper[4870]: I0130 08:27:40.451352 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6de9-account-create-update-nwcgl" podStartSLOduration=2.45131723 podStartE2EDuration="2.45131723s" podCreationTimestamp="2026-01-30 08:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:40.440248793 +0000 UTC m=+1099.135795902" watchObservedRunningTime="2026-01-30 08:27:40.45131723 +0000 UTC m=+1099.146864339"
Jan 30 08:27:41 crc kubenswrapper[4870]: I0130 08:27:41.431169 4870 generic.go:334] "Generic (PLEG): container finished" podID="eb61b735-bf9c-4bf5-a5cf-1948435af72e" containerID="a86a16c99cbecfe80af026afdf8bc6eec15eafc6658e69be7a63babe7a18aa00" exitCode=0
Jan 30 08:27:41 crc kubenswrapper[4870]: I0130 08:27:41.431272 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9d1f-account-create-update-mffzg" event={"ID":"eb61b735-bf9c-4bf5-a5cf-1948435af72e","Type":"ContainerDied","Data":"a86a16c99cbecfe80af026afdf8bc6eec15eafc6658e69be7a63babe7a18aa00"}
Jan 30 08:27:41 crc kubenswrapper[4870]: I0130 08:27:41.435937 4870 generic.go:334] "Generic (PLEG): container finished" podID="b6566e49-850d-460e-9a22-9bfd7384f494" containerID="96c828944b59ded4cdb603b725476894266a7134b9d788fbea6f5b49b309942a" exitCode=0
Jan 30 08:27:41 crc kubenswrapper[4870]: I0130 08:27:41.435995 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6de9-account-create-update-nwcgl" event={"ID":"b6566e49-850d-460e-9a22-9bfd7384f494","Type":"ContainerDied","Data":"96c828944b59ded4cdb603b725476894266a7134b9d788fbea6f5b49b309942a"}
Jan 30 08:27:41 crc kubenswrapper[4870]: I0130 08:27:41.437819 4870 generic.go:334] "Generic (PLEG): container finished" podID="051874aa-a01e-40bf-a987-a830886ea878" containerID="cb21d97629ca986697c0491abe322efbe6175a053c82fd76648eaa5b827fb2cc" exitCode=0
Jan 30 08:27:41 crc kubenswrapper[4870]: I0130 08:27:41.437898 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xrsjh" event={"ID":"051874aa-a01e-40bf-a987-a830886ea878","Type":"ContainerDied","Data":"cb21d97629ca986697c0491abe322efbe6175a053c82fd76648eaa5b827fb2cc"}
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.147977 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8td6r"
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.184240 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9d1f-account-create-update-mffzg"
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.191411 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xrsjh"
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.226719 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0515-account-create-update-rln5d"
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.234012 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-937e-account-create-update-6w49r"
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.241795 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6de9-account-create-update-nwcgl"
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320238 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19155d05-01da-4e21-96c2-f23662f8f785-operator-scripts\") pod \"19155d05-01da-4e21-96c2-f23662f8f785\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320323 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5894m\" (UniqueName: \"kubernetes.io/projected/eb61b735-bf9c-4bf5-a5cf-1948435af72e-kube-api-access-5894m\") pod \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320371 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d78lm\" (UniqueName: \"kubernetes.io/projected/19155d05-01da-4e21-96c2-f23662f8f785-kube-api-access-d78lm\") pod \"19155d05-01da-4e21-96c2-f23662f8f785\" (UID: \"19155d05-01da-4e21-96c2-f23662f8f785\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320391 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htvx9\" (UniqueName: \"kubernetes.io/projected/17e1f740-4393-4ba2-8242-fb863196cb02-kube-api-access-htvx9\") pod \"17e1f740-4393-4ba2-8242-fb863196cb02\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320411 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfc35112-b552-434a-b702-26c53cbf5574-operator-scripts\") pod \"dfc35112-b552-434a-b702-26c53cbf5574\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320429 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051874aa-a01e-40bf-a987-a830886ea878-operator-scripts\") pod \"051874aa-a01e-40bf-a987-a830886ea878\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320464 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb61b735-bf9c-4bf5-a5cf-1948435af72e-operator-scripts\") pod \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\" (UID: \"eb61b735-bf9c-4bf5-a5cf-1948435af72e\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320496 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e1f740-4393-4ba2-8242-fb863196cb02-operator-scripts\") pod \"17e1f740-4393-4ba2-8242-fb863196cb02\" (UID: \"17e1f740-4393-4ba2-8242-fb863196cb02\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320557 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b42t4\" (UniqueName: \"kubernetes.io/projected/051874aa-a01e-40bf-a987-a830886ea878-kube-api-access-b42t4\") pod \"051874aa-a01e-40bf-a987-a830886ea878\" (UID: \"051874aa-a01e-40bf-a987-a830886ea878\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.320592 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f4zh\" (UniqueName: \"kubernetes.io/projected/dfc35112-b552-434a-b702-26c53cbf5574-kube-api-access-8f4zh\") pod \"dfc35112-b552-434a-b702-26c53cbf5574\" (UID: \"dfc35112-b552-434a-b702-26c53cbf5574\") "
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.321942 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfc35112-b552-434a-b702-26c53cbf5574-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfc35112-b552-434a-b702-26c53cbf5574" (UID: "dfc35112-b552-434a-b702-26c53cbf5574"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.321939 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19155d05-01da-4e21-96c2-f23662f8f785-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19155d05-01da-4e21-96c2-f23662f8f785" (UID: "19155d05-01da-4e21-96c2-f23662f8f785"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.321950 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/051874aa-a01e-40bf-a987-a830886ea878-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "051874aa-a01e-40bf-a987-a830886ea878" (UID: "051874aa-a01e-40bf-a987-a830886ea878"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.322504 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb61b735-bf9c-4bf5-a5cf-1948435af72e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb61b735-bf9c-4bf5-a5cf-1948435af72e" (UID: "eb61b735-bf9c-4bf5-a5cf-1948435af72e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.323155 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e1f740-4393-4ba2-8242-fb863196cb02-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17e1f740-4393-4ba2-8242-fb863196cb02" (UID: "17e1f740-4393-4ba2-8242-fb863196cb02"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.327757 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/051874aa-a01e-40bf-a987-a830886ea878-kube-api-access-b42t4" (OuterVolumeSpecName: "kube-api-access-b42t4") pod "051874aa-a01e-40bf-a987-a830886ea878" (UID: "051874aa-a01e-40bf-a987-a830886ea878"). InnerVolumeSpecName "kube-api-access-b42t4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.329248 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e1f740-4393-4ba2-8242-fb863196cb02-kube-api-access-htvx9" (OuterVolumeSpecName: "kube-api-access-htvx9") pod "17e1f740-4393-4ba2-8242-fb863196cb02" (UID: "17e1f740-4393-4ba2-8242-fb863196cb02"). InnerVolumeSpecName "kube-api-access-htvx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.329397 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb61b735-bf9c-4bf5-a5cf-1948435af72e-kube-api-access-5894m" (OuterVolumeSpecName: "kube-api-access-5894m") pod "eb61b735-bf9c-4bf5-a5cf-1948435af72e" (UID: "eb61b735-bf9c-4bf5-a5cf-1948435af72e"). InnerVolumeSpecName "kube-api-access-5894m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.329423 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc35112-b552-434a-b702-26c53cbf5574-kube-api-access-8f4zh" (OuterVolumeSpecName: "kube-api-access-8f4zh") pod "dfc35112-b552-434a-b702-26c53cbf5574" (UID: "dfc35112-b552-434a-b702-26c53cbf5574"). InnerVolumeSpecName "kube-api-access-8f4zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.329790 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19155d05-01da-4e21-96c2-f23662f8f785-kube-api-access-d78lm" (OuterVolumeSpecName: "kube-api-access-d78lm") pod "19155d05-01da-4e21-96c2-f23662f8f785" (UID: "19155d05-01da-4e21-96c2-f23662f8f785"). InnerVolumeSpecName "kube-api-access-d78lm".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.421685 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmq87\" (UniqueName: \"kubernetes.io/projected/b6566e49-850d-460e-9a22-9bfd7384f494-kube-api-access-nmq87\") pod \"b6566e49-850d-460e-9a22-9bfd7384f494\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.421951 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6566e49-850d-460e-9a22-9bfd7384f494-operator-scripts\") pod \"b6566e49-850d-460e-9a22-9bfd7384f494\" (UID: \"b6566e49-850d-460e-9a22-9bfd7384f494\") " Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422499 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6566e49-850d-460e-9a22-9bfd7384f494-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6566e49-850d-460e-9a22-9bfd7384f494" (UID: "b6566e49-850d-460e-9a22-9bfd7384f494"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422775 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b42t4\" (UniqueName: \"kubernetes.io/projected/051874aa-a01e-40bf-a987-a830886ea878-kube-api-access-b42t4\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422795 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f4zh\" (UniqueName: \"kubernetes.io/projected/dfc35112-b552-434a-b702-26c53cbf5574-kube-api-access-8f4zh\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422807 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19155d05-01da-4e21-96c2-f23662f8f785-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422816 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5894m\" (UniqueName: \"kubernetes.io/projected/eb61b735-bf9c-4bf5-a5cf-1948435af72e-kube-api-access-5894m\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422825 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d78lm\" (UniqueName: \"kubernetes.io/projected/19155d05-01da-4e21-96c2-f23662f8f785-kube-api-access-d78lm\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422834 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htvx9\" (UniqueName: \"kubernetes.io/projected/17e1f740-4393-4ba2-8242-fb863196cb02-kube-api-access-htvx9\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422842 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfc35112-b552-434a-b702-26c53cbf5574-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422851 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6566e49-850d-460e-9a22-9bfd7384f494-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422859 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/051874aa-a01e-40bf-a987-a830886ea878-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422868 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb61b735-bf9c-4bf5-a5cf-1948435af72e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.422893 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e1f740-4393-4ba2-8242-fb863196cb02-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.426782 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6566e49-850d-460e-9a22-9bfd7384f494-kube-api-access-nmq87" (OuterVolumeSpecName: "kube-api-access-nmq87") pod "b6566e49-850d-460e-9a22-9bfd7384f494" (UID: "b6566e49-850d-460e-9a22-9bfd7384f494"). InnerVolumeSpecName "kube-api-access-nmq87". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.498375 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9d1f-account-create-update-mffzg" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.498383 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9d1f-account-create-update-mffzg" event={"ID":"eb61b735-bf9c-4bf5-a5cf-1948435af72e","Type":"ContainerDied","Data":"784b44f85ae007ee5455ec0b1b0eefd1ddd8dc08aafa33e4bdb7deee1e8d44f2"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.500506 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="784b44f85ae007ee5455ec0b1b0eefd1ddd8dc08aafa33e4bdb7deee1e8d44f2" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.501353 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0515-account-create-update-rln5d" event={"ID":"19155d05-01da-4e21-96c2-f23662f8f785","Type":"ContainerDied","Data":"3133fff318af851e54e7933fe20396b64a992bab1d22e8aca788bbc77160af37"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.501396 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3133fff318af851e54e7933fe20396b64a992bab1d22e8aca788bbc77160af37" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.501411 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0515-account-create-update-rln5d" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.504241 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f6r68" event={"ID":"881527d5-776b-4639-9306-895d1e370abd","Type":"ContainerStarted","Data":"c7c08f4bc1bd775e569c12ce6f45113dd74be7d4b1436663db01b3cc4e31c119"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.508330 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6de9-account-create-update-nwcgl" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.508412 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6de9-account-create-update-nwcgl" event={"ID":"b6566e49-850d-460e-9a22-9bfd7384f494","Type":"ContainerDied","Data":"ffc742f6a81620434ff82836c4c9e4cd30220b47560730abe3ffaf861348054f"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.508484 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc742f6a81620434ff82836c4c9e4cd30220b47560730abe3ffaf861348054f" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.514942 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xrsjh" event={"ID":"051874aa-a01e-40bf-a987-a830886ea878","Type":"ContainerDied","Data":"6af75c88e6c19044721841617ae2fbc93efc4be1bd4b334456132a0a6bec8e0f"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.515015 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6af75c88e6c19044721841617ae2fbc93efc4be1bd4b334456132a0a6bec8e0f" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.515148 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xrsjh" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.523795 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-gbfzh" event={"ID":"e8637667-8b7e-455e-8ba9-b6291574e4ce","Type":"ContainerStarted","Data":"0e37f5a9ce757405b6d3e4a3c9aee5ca81c0dd18541f09603a2ca8623a81a084"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.527229 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmq87\" (UniqueName: \"kubernetes.io/projected/b6566e49-850d-460e-9a22-9bfd7384f494-kube-api-access-nmq87\") on node \"crc\" DevicePath \"\"" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.530803 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-f6r68" podStartSLOduration=1.816888226 podStartE2EDuration="11.530782351s" podCreationTimestamp="2026-01-30 08:27:36 +0000 UTC" firstStartedPulling="2026-01-30 08:27:37.189218014 +0000 UTC m=+1095.884765123" lastFinishedPulling="2026-01-30 08:27:46.903112129 +0000 UTC m=+1105.598659248" observedRunningTime="2026-01-30 08:27:47.522342868 +0000 UTC m=+1106.217890017" watchObservedRunningTime="2026-01-30 08:27:47.530782351 +0000 UTC m=+1106.226329470" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.532005 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8td6r" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.532126 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8td6r" event={"ID":"dfc35112-b552-434a-b702-26c53cbf5574","Type":"ContainerDied","Data":"d42477892b3ddaabcfbb181315d9bfde068e17d9e265081ee29d48545444f115"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.532180 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42477892b3ddaabcfbb181315d9bfde068e17d9e265081ee29d48545444f115" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.536130 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-937e-account-create-update-6w49r" event={"ID":"17e1f740-4393-4ba2-8242-fb863196cb02","Type":"ContainerDied","Data":"b9684209c89c1359375b31da44bbfc4622187e78936d27a08d572996df752ae7"} Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.536167 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9684209c89c1359375b31da44bbfc4622187e78936d27a08d572996df752ae7" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.536228 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-937e-account-create-update-6w49r" Jan 30 08:27:47 crc kubenswrapper[4870]: I0130 08:27:47.556576 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-gbfzh" podStartSLOduration=2.109261327 podStartE2EDuration="9.556556132s" podCreationTimestamp="2026-01-30 08:27:38 +0000 UTC" firstStartedPulling="2026-01-30 08:27:39.487677774 +0000 UTC m=+1098.183224883" lastFinishedPulling="2026-01-30 08:27:46.934872366 +0000 UTC m=+1105.630519688" observedRunningTime="2026-01-30 08:27:47.548591115 +0000 UTC m=+1106.244138244" watchObservedRunningTime="2026-01-30 08:27:47.556556132 +0000 UTC m=+1106.252103241" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896121 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tssp8"] Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896792 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e1f740-4393-4ba2-8242-fb863196cb02" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896808 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e1f740-4393-4ba2-8242-fb863196cb02" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896826 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f46507-531f-4d06-86d9-6c07a50abc6d" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896834 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f46507-531f-4d06-86d9-6c07a50abc6d" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896845 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc35112-b552-434a-b702-26c53cbf5574" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896856 4870 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="dfc35112-b552-434a-b702-26c53cbf5574" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896868 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61b735-bf9c-4bf5-a5cf-1948435af72e" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896894 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61b735-bf9c-4bf5-a5cf-1948435af72e" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896908 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6566e49-850d-460e-9a22-9bfd7384f494" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896916 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6566e49-850d-460e-9a22-9bfd7384f494" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896932 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19155d05-01da-4e21-96c2-f23662f8f785" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896940 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="19155d05-01da-4e21-96c2-f23662f8f785" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896968 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="051874aa-a01e-40bf-a987-a830886ea878" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896975 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="051874aa-a01e-40bf-a987-a830886ea878" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: E0130 08:27:48.896989 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d425622-da05-4988-a059-013c06b4ecf1" 
containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.896997 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d425622-da05-4988-a059-013c06b4ecf1" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897209 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e1f740-4393-4ba2-8242-fb863196cb02" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897242 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb61b735-bf9c-4bf5-a5cf-1948435af72e" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897267 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="051874aa-a01e-40bf-a987-a830886ea878" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897288 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f46507-531f-4d06-86d9-6c07a50abc6d" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897297 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="19155d05-01da-4e21-96c2-f23662f8f785" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897316 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc35112-b552-434a-b702-26c53cbf5574" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897340 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d425622-da05-4988-a059-013c06b4ecf1" containerName="mariadb-database-create" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.897353 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6566e49-850d-460e-9a22-9bfd7384f494" containerName="mariadb-account-create-update" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 
08:27:48.898031 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.901272 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.901431 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-58ht6" Jan 30 08:27:48 crc kubenswrapper[4870]: I0130 08:27:48.906065 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tssp8"] Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.054160 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-config-data\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.054254 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8n8\" (UniqueName: \"kubernetes.io/projected/edd09a42-14b6-4161-ba2a-82c4cf4f5983-kube-api-access-7n8n8\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.054423 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-combined-ca-bundle\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.054469 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-db-sync-config-data\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.155604 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8n8\" (UniqueName: \"kubernetes.io/projected/edd09a42-14b6-4161-ba2a-82c4cf4f5983-kube-api-access-7n8n8\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.155784 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-combined-ca-bundle\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.155852 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-db-sync-config-data\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.155922 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-config-data\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.160027 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-config-data\") pod \"glance-db-sync-tssp8\" (UID: 
\"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.161329 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-combined-ca-bundle\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.161858 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-db-sync-config-data\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.184420 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8n8\" (UniqueName: \"kubernetes.io/projected/edd09a42-14b6-4161-ba2a-82c4cf4f5983-kube-api-access-7n8n8\") pod \"glance-db-sync-tssp8\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:49 crc kubenswrapper[4870]: I0130 08:27:49.253778 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tssp8" Jan 30 08:27:50 crc kubenswrapper[4870]: I0130 08:27:50.277717 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tssp8"] Jan 30 08:27:50 crc kubenswrapper[4870]: I0130 08:27:50.562914 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tssp8" event={"ID":"edd09a42-14b6-4161-ba2a-82c4cf4f5983","Type":"ContainerStarted","Data":"8bfad17f6d235c11635a3d5c597e4e8cad4341b4b72906e827a01ca540cffaac"} Jan 30 08:27:51 crc kubenswrapper[4870]: I0130 08:27:51.573104 4870 generic.go:334] "Generic (PLEG): container finished" podID="e8637667-8b7e-455e-8ba9-b6291574e4ce" containerID="0e37f5a9ce757405b6d3e4a3c9aee5ca81c0dd18541f09603a2ca8623a81a084" exitCode=0 Jan 30 08:27:51 crc kubenswrapper[4870]: I0130 08:27:51.573206 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-gbfzh" event={"ID":"e8637667-8b7e-455e-8ba9-b6291574e4ce","Type":"ContainerDied","Data":"0e37f5a9ce757405b6d3e4a3c9aee5ca81c0dd18541f09603a2ca8623a81a084"} Jan 30 08:27:52 crc kubenswrapper[4870]: I0130 08:27:52.584477 4870 generic.go:334] "Generic (PLEG): container finished" podID="881527d5-776b-4639-9306-895d1e370abd" containerID="c7c08f4bc1bd775e569c12ce6f45113dd74be7d4b1436663db01b3cc4e31c119" exitCode=0 Jan 30 08:27:52 crc kubenswrapper[4870]: I0130 08:27:52.584730 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f6r68" event={"ID":"881527d5-776b-4639-9306-895d1e370abd","Type":"ContainerDied","Data":"c7c08f4bc1bd775e569c12ce6f45113dd74be7d4b1436663db01b3cc4e31c119"} Jan 30 08:27:52 crc kubenswrapper[4870]: I0130 08:27:52.960597 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-gbfzh" Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.018353 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-db-sync-config-data\") pod \"e8637667-8b7e-455e-8ba9-b6291574e4ce\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.019053 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-combined-ca-bundle\") pod \"e8637667-8b7e-455e-8ba9-b6291574e4ce\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.019094 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbn2\" (UniqueName: \"kubernetes.io/projected/e8637667-8b7e-455e-8ba9-b6291574e4ce-kube-api-access-xfbn2\") pod \"e8637667-8b7e-455e-8ba9-b6291574e4ce\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.019201 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-config-data\") pod \"e8637667-8b7e-455e-8ba9-b6291574e4ce\" (UID: \"e8637667-8b7e-455e-8ba9-b6291574e4ce\") " Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.027072 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e8637667-8b7e-455e-8ba9-b6291574e4ce" (UID: "e8637667-8b7e-455e-8ba9-b6291574e4ce"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.028133 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8637667-8b7e-455e-8ba9-b6291574e4ce-kube-api-access-xfbn2" (OuterVolumeSpecName: "kube-api-access-xfbn2") pod "e8637667-8b7e-455e-8ba9-b6291574e4ce" (UID: "e8637667-8b7e-455e-8ba9-b6291574e4ce"). InnerVolumeSpecName "kube-api-access-xfbn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.062724 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8637667-8b7e-455e-8ba9-b6291574e4ce" (UID: "e8637667-8b7e-455e-8ba9-b6291574e4ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.082522 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-config-data" (OuterVolumeSpecName: "config-data") pod "e8637667-8b7e-455e-8ba9-b6291574e4ce" (UID: "e8637667-8b7e-455e-8ba9-b6291574e4ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.121694 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.121737 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbn2\" (UniqueName: \"kubernetes.io/projected/e8637667-8b7e-455e-8ba9-b6291574e4ce-kube-api-access-xfbn2\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.121752 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.121766 4870 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8637667-8b7e-455e-8ba9-b6291574e4ce-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.596144 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-gbfzh" event={"ID":"e8637667-8b7e-455e-8ba9-b6291574e4ce","Type":"ContainerDied","Data":"25fc46aae9550fd891b36bb65004fc5f285237547119046fa6127a16efed7dfd"}
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.596197 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25fc46aae9550fd891b36bb65004fc5f285237547119046fa6127a16efed7dfd"
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.596216 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-gbfzh"
Jan 30 08:27:53 crc kubenswrapper[4870]: I0130 08:27:53.961655 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f6r68"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.037779 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-config-data\") pod \"881527d5-776b-4639-9306-895d1e370abd\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") "
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.037841 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wknsc\" (UniqueName: \"kubernetes.io/projected/881527d5-776b-4639-9306-895d1e370abd-kube-api-access-wknsc\") pod \"881527d5-776b-4639-9306-895d1e370abd\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") "
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.037932 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-combined-ca-bundle\") pod \"881527d5-776b-4639-9306-895d1e370abd\" (UID: \"881527d5-776b-4639-9306-895d1e370abd\") "
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.044020 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881527d5-776b-4639-9306-895d1e370abd-kube-api-access-wknsc" (OuterVolumeSpecName: "kube-api-access-wknsc") pod "881527d5-776b-4639-9306-895d1e370abd" (UID: "881527d5-776b-4639-9306-895d1e370abd"). InnerVolumeSpecName "kube-api-access-wknsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.062015 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "881527d5-776b-4639-9306-895d1e370abd" (UID: "881527d5-776b-4639-9306-895d1e370abd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.084961 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-config-data" (OuterVolumeSpecName: "config-data") pod "881527d5-776b-4639-9306-895d1e370abd" (UID: "881527d5-776b-4639-9306-895d1e370abd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.140338 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.140374 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881527d5-776b-4639-9306-895d1e370abd-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.140384 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wknsc\" (UniqueName: \"kubernetes.io/projected/881527d5-776b-4639-9306-895d1e370abd-kube-api-access-wknsc\") on node \"crc\" DevicePath \"\""
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.615141 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f6r68" event={"ID":"881527d5-776b-4639-9306-895d1e370abd","Type":"ContainerDied","Data":"a31484a12f791686023b40843494f6f2ece996f4e78a4c51e74f6a74ad7d512e"}
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.615217 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a31484a12f791686023b40843494f6f2ece996f4e78a4c51e74f6a74ad7d512e"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.617743 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f6r68"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.859645 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bb457dfc5-294j8"]
Jan 30 08:27:54 crc kubenswrapper[4870]: E0130 08:27:54.860086 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881527d5-776b-4639-9306-895d1e370abd" containerName="keystone-db-sync"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.860100 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="881527d5-776b-4639-9306-895d1e370abd" containerName="keystone-db-sync"
Jan 30 08:27:54 crc kubenswrapper[4870]: E0130 08:27:54.860115 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8637667-8b7e-455e-8ba9-b6291574e4ce" containerName="watcher-db-sync"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.860121 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8637667-8b7e-455e-8ba9-b6291574e4ce" containerName="watcher-db-sync"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.860313 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8637667-8b7e-455e-8ba9-b6291574e4ce" containerName="watcher-db-sync"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.860326 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="881527d5-776b-4639-9306-895d1e370abd" containerName="keystone-db-sync"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.861213 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.882222 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb457dfc5-294j8"]
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.892757 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vd7q8"]
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.898305 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.904143 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vd7q8"]
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.905058 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vn7b5"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.905437 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.906555 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.906766 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.906868 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.979593 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-scripts\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.980214 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nf46\" (UniqueName: \"kubernetes.io/projected/38224a9d-ced6-4f76-8117-18e7ca7f33e7-kube-api-access-2nf46\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.980317 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.980395 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-fernet-keys\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.980563 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-config\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.980701 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.981045 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s8jg\" (UniqueName: \"kubernetes.io/projected/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-kube-api-access-8s8jg\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.981138 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-svc\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.981231 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.981507 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-credential-keys\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.981587 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-config-data\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:54 crc kubenswrapper[4870]: I0130 08:27:54.981663 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-combined-ca-bundle\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.050609 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.052155 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.061543 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.075486 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-b9kpk"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.085091 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-credential-keys\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.085289 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-config-data\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.085359 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-combined-ca-bundle\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.085443 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-scripts\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.085508 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nf46\" (UniqueName: \"kubernetes.io/projected/38224a9d-ced6-4f76-8117-18e7ca7f33e7-kube-api-access-2nf46\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.085586 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.085654 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-fernet-keys\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.086421 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-config\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.086496 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.086575 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s8jg\" (UniqueName: \"kubernetes.io/projected/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-kube-api-access-8s8jg\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.086661 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-svc\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.086741 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.087665 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.088020 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-config\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.088610 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.088858 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.089597 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-svc\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.096829 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-fernet-keys\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.103349 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.114832 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-credential-keys\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.115039 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-combined-ca-bundle\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.125480 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-config-data\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.133179 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.148247 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.150156 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nf46\" (UniqueName: \"kubernetes.io/projected/38224a9d-ced6-4f76-8117-18e7ca7f33e7-kube-api-access-2nf46\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.151773 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.157451 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.163707 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s8jg\" (UniqueName: \"kubernetes.io/projected/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-kube-api-access-8s8jg\") pod \"dnsmasq-dns-5bb457dfc5-294j8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.165099 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-scripts\") pod \"keystone-bootstrap-vd7q8\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.190145 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqd86\" (UniqueName: \"kubernetes.io/projected/d501bb9c-d88d-4362-a48e-4d0347ecc90e-kube-api-access-gqd86\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.190184 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d501bb9c-d88d-4362-a48e-4d0347ecc90e-logs\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.190220 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-config-data\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.190282 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.191509 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb457dfc5-294j8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.193358 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.198511 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.202820 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.207982 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.222508 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d56cb75f7-5b6cr"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.223984 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d56cb75f7-5b6cr"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.235429 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vd7q8"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.240054 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.240097 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.240061 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-brkzs"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.240290 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.249460 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.249514 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.266625 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9g27p"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.267762 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9g27p"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.289667 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.289839 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.289961 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4blb4"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291401 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gsr9\" (UniqueName: \"kubernetes.io/projected/3be4280c-f244-49ee-8731-bf39ac51ee1e-kube-api-access-2gsr9\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291440 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-config-data\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291464 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be4280c-f244-49ee-8731-bf39ac51ee1e-logs\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291483 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/547994a2-f3d5-4ac9-a025-2644e86fe00d-logs\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291501 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291517 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-config-data\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291538 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291558 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-scripts\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291573 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-config-data\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291601 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291618 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291648 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gl4q\" (UniqueName: \"kubernetes.io/projected/547994a2-f3d5-4ac9-a025-2644e86fe00d-kube-api-access-7gl4q\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291670 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/547994a2-f3d5-4ac9-a025-2644e86fe00d-horizon-secret-key\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291708 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-logs\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291725 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sskqz\" (UniqueName: \"kubernetes.io/projected/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-kube-api-access-sskqz\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291747 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291769 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291792 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqd86\" (UniqueName: \"kubernetes.io/projected/d501bb9c-d88d-4362-a48e-4d0347ecc90e-kube-api-access-gqd86\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.291809 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d501bb9c-d88d-4362-a48e-4d0347ecc90e-logs\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.292190 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d501bb9c-d88d-4362-a48e-4d0347ecc90e-logs\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.302837 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.310940 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d56cb75f7-5b6cr"]
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.311496 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-config-data\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0"
Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.322676 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqd86\" (UniqueName:
\"kubernetes.io/projected/d501bb9c-d88d-4362-a48e-4d0347ecc90e-kube-api-access-gqd86\") pod \"watcher-applier-0\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " pod="openstack/watcher-applier-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.357013 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9g27p"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.387252 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9mjj4"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.388789 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.397481 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.397600 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.397774 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nnfmm" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.409799 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bde78-dea1-4864-a825-af176178bd11-etc-machine-id\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.409844 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-logs\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 
08:27:55.409897 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sskqz\" (UniqueName: \"kubernetes.io/projected/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-kube-api-access-sskqz\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.409946 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.409976 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-db-sync-config-data\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410001 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410040 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgl2s\" (UniqueName: \"kubernetes.io/projected/685bde78-dea1-4864-a825-af176178bd11-kube-api-access-lgl2s\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410087 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gsr9\" (UniqueName: \"kubernetes.io/projected/3be4280c-f244-49ee-8731-bf39ac51ee1e-kube-api-access-2gsr9\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410106 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be4280c-f244-49ee-8731-bf39ac51ee1e-logs\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410122 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-config-data\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410146 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/547994a2-f3d5-4ac9-a025-2644e86fe00d-logs\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410163 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-config-data\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410177 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410200 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-scripts\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410223 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410255 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-scripts\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410275 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-config-data\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410313 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-combined-ca-bundle\") pod \"cinder-db-sync-9g27p\" (UID: 
\"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410343 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410399 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gl4q\" (UniqueName: \"kubernetes.io/projected/547994a2-f3d5-4ac9-a025-2644e86fe00d-kube-api-access-7gl4q\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.410429 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/547994a2-f3d5-4ac9-a025-2644e86fe00d-horizon-secret-key\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.411300 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-logs\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.412233 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/547994a2-f3d5-4ac9-a025-2644e86fe00d-logs\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc 
kubenswrapper[4870]: I0130 08:27:55.412248 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be4280c-f244-49ee-8731-bf39ac51ee1e-logs\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.412715 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-scripts\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.413314 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-config-data\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.416556 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.418194 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-config-data\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.419472 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.419540 4870 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.420362 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.435376 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.437049 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.437772 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.438043 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.438263 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.439227 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/547994a2-f3d5-4ac9-a025-2644e86fe00d-horizon-secret-key\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " 
pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.449207 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sskqz\" (UniqueName: \"kubernetes.io/projected/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-kube-api-access-sskqz\") pod \"watcher-decision-engine-0\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.453585 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.453661 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9mjj4"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.467289 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gl4q\" (UniqueName: \"kubernetes.io/projected/547994a2-f3d5-4ac9-a025-2644e86fe00d-kube-api-access-7gl4q\") pod \"horizon-6d56cb75f7-5b6cr\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.467568 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gsr9\" (UniqueName: \"kubernetes.io/projected/3be4280c-f244-49ee-8731-bf39ac51ee1e-kube-api-access-2gsr9\") pod \"watcher-api-0\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") " pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.470163 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-d2mx7"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.471264 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.474829 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qmdf5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.475100 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.481965 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bb457dfc5-294j8"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.490099 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-b57k5"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.491708 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.495180 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.495986 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.497532 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-skpxp" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.512449 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.512529 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-config-data\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.513844 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-scripts\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.513900 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-run-httpd\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.513919 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-log-httpd\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.513938 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwsc\" (UniqueName: \"kubernetes.io/projected/505df376-c8bc-44ce-9c14-8cf94730c550-kube-api-access-7rwsc\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514001 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-combined-ca-bundle\") pod \"cinder-db-sync-9g27p\" (UID: 
\"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514040 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-config\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514066 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdb9v\" (UniqueName: \"kubernetes.io/projected/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-kube-api-access-gdb9v\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514142 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-config-data\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514272 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bde78-dea1-4864-a825-af176178bd11-etc-machine-id\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514296 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc 
kubenswrapper[4870]: I0130 08:27:55.514377 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-combined-ca-bundle\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514419 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-scripts\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514454 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-db-sync-config-data\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.514509 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgl2s\" (UniqueName: \"kubernetes.io/projected/685bde78-dea1-4864-a825-af176178bd11-kube-api-access-lgl2s\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.516145 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bde78-dea1-4864-a825-af176178bd11-etc-machine-id\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.518563 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-scripts\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.518974 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-combined-ca-bundle\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.519263 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.519581 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-db-sync-config-data\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.520970 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d2mx7"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.524180 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-config-data\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.540713 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgl2s\" (UniqueName: \"kubernetes.io/projected/685bde78-dea1-4864-a825-af176178bd11-kube-api-access-lgl2s\") pod \"cinder-db-sync-9g27p\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " 
pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.555133 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-b57k5"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.570788 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fbb4d475f-66fsw"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.575279 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.577597 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fb6b548c7-56kg5"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.579650 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.585145 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fb6b548c7-56kg5"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.594513 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbb4d475f-66fsw"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615535 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-combined-ca-bundle\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615581 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-scripts\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615605 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmq48\" (UniqueName: \"kubernetes.io/projected/c3bd649e-5c3c-495f-933f-3b516167cbd2-kube-api-access-bmq48\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615631 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-combined-ca-bundle\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615659 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-db-sync-config-data\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615682 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-logs\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615703 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615735 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-run-httpd\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615752 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwsc\" (UniqueName: \"kubernetes.io/projected/505df376-c8bc-44ce-9c14-8cf94730c550-kube-api-access-7rwsc\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615765 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-log-httpd\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615779 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-scripts\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615811 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pwt\" (UniqueName: \"kubernetes.io/projected/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-kube-api-access-n2pwt\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615831 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-config-data\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615848 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-config\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615864 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdb9v\" (UniqueName: \"kubernetes.io/projected/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-kube-api-access-gdb9v\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615939 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-combined-ca-bundle\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615959 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-config-data\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.615977 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.624845 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-combined-ca-bundle\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.626655 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-run-httpd\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.628850 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-log-httpd\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.633674 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-config\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.639769 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.649544 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.652416 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.653708 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-config-data\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.661213 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.666837 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-scripts\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.667859 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdb9v\" (UniqueName: \"kubernetes.io/projected/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-kube-api-access-gdb9v\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.668772 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9g27p" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.669533 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.676467 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwsc\" (UniqueName: \"kubernetes.io/projected/505df376-c8bc-44ce-9c14-8cf94730c550-kube-api-access-7rwsc\") pod \"neutron-db-sync-9mjj4\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727002 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-db-sync-config-data\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727269 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zw7\" (UniqueName: \"kubernetes.io/projected/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-kube-api-access-r7zw7\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727309 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-config-data\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 
30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727347 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-logs\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727402 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnppg\" (UniqueName: \"kubernetes.io/projected/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-kube-api-access-jnppg\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727464 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-scripts\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727507 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727535 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: 
I0130 08:27:55.727577 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2pwt\" (UniqueName: \"kubernetes.io/projected/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-kube-api-access-n2pwt\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727601 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-config-data\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727621 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-logs\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727662 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727685 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-horizon-secret-key\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727718 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-scripts\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727771 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-combined-ca-bundle\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727817 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-svc\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727860 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmq48\" (UniqueName: \"kubernetes.io/projected/c3bd649e-5c3c-495f-933f-3b516167cbd2-kube-api-access-bmq48\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727908 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-combined-ca-bundle\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.727933 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-config\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.731168 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-logs\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.732524 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-config-data\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.734384 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-combined-ca-bundle\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.735129 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-scripts\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.735782 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-db-sync-config-data\") 
pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.735951 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-combined-ca-bundle\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.752966 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.756995 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pwt\" (UniqueName: \"kubernetes.io/projected/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-kube-api-access-n2pwt\") pod \"placement-db-sync-b57k5\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.762485 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmq48\" (UniqueName: \"kubernetes.io/projected/c3bd649e-5c3c-495f-933f-3b516167cbd2-kube-api-access-bmq48\") pod \"barbican-db-sync-d2mx7\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.772175 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.799100 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.820157 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-b57k5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.831464 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-svc\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.831558 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-config\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.831614 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7zw7\" (UniqueName: \"kubernetes.io/projected/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-kube-api-access-r7zw7\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.832796 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-config-data\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.832904 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnppg\" (UniqueName: \"kubernetes.io/projected/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-kube-api-access-jnppg\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 
08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.832980 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.833011 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.833065 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-logs\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.833098 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.833120 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-horizon-secret-key\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.833150 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-scripts\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.833196 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-config\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.833600 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-logs\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.834283 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.834604 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-config-data\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.835822 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-scripts\") 
pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.838385 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-horizon-secret-key\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.842308 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-svc\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.843083 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.848214 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.852609 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnppg\" (UniqueName: \"kubernetes.io/projected/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-kube-api-access-jnppg\") pod \"horizon-5fb6b548c7-56kg5\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " 
pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.857911 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zw7\" (UniqueName: \"kubernetes.io/projected/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-kube-api-access-r7zw7\") pod \"dnsmasq-dns-7fbb4d475f-66fsw\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.888592 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vd7q8"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.896381 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.897517 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bb457dfc5-294j8"] Jan 30 08:27:55 crc kubenswrapper[4870]: I0130 08:27:55.915108 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:27:55 crc kubenswrapper[4870]: W0130 08:27:55.985269 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc98de4d_b882_4f13_bc7e_1e6070ffd7d8.slice/crio-0596995a387e00d4157105eb8a5a484304961f9369875c777b77a509e3229e35 WatchSource:0}: Error finding container 0596995a387e00d4157105eb8a5a484304961f9369875c777b77a509e3229e35: Status 404 returned error can't find the container with id 0596995a387e00d4157105eb8a5a484304961f9369875c777b77a509e3229e35 Jan 30 08:27:56 crc kubenswrapper[4870]: W0130 08:27:56.010950 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38224a9d_ced6_4f76_8117_18e7ca7f33e7.slice/crio-bf4fa656174794487020ac5e8e368cdaeecb6ab51935c7a52ce6d58beb8d1571 WatchSource:0}: Error finding container bf4fa656174794487020ac5e8e368cdaeecb6ab51935c7a52ce6d58beb8d1571: Status 404 returned error can't find the container with id bf4fa656174794487020ac5e8e368cdaeecb6ab51935c7a52ce6d58beb8d1571 Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.209035 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.293338 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 08:27:56 crc kubenswrapper[4870]: W0130 08:27:56.345475 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eb0ea94_f1b1_41c4_a968_ff1d4af60e2f.slice/crio-688a235f4ea3f6827668ea12cec3c801c91b77e9c743386ea44f9613e759f338 WatchSource:0}: Error finding container 688a235f4ea3f6827668ea12cec3c801c91b77e9c743386ea44f9613e759f338: Status 404 returned error can't find the container with id 
688a235f4ea3f6827668ea12cec3c801c91b77e9c743386ea44f9613e759f338
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.584536 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.687416 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9mjj4"]
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.687704 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vd7q8" event={"ID":"38224a9d-ced6-4f76-8117-18e7ca7f33e7","Type":"ContainerStarted","Data":"bf4fa656174794487020ac5e8e368cdaeecb6ab51935c7a52ce6d58beb8d1571"}
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.694094 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f","Type":"ContainerStarted","Data":"688a235f4ea3f6827668ea12cec3c801c91b77e9c743386ea44f9613e759f338"}
Jan 30 08:27:56 crc kubenswrapper[4870]: W0130 08:27:56.696990 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod685bde78_dea1_4864_a825_af176178bd11.slice/crio-208d52cf2eb05b40a226e4a1738a5480e549f0f08e4a6ad6b02178c49de677f7 WatchSource:0}: Error finding container 208d52cf2eb05b40a226e4a1738a5480e549f0f08e4a6ad6b02178c49de677f7: Status 404 returned error can't find the container with id 208d52cf2eb05b40a226e4a1738a5480e549f0f08e4a6ad6b02178c49de677f7
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.697261 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9g27p"]
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.698056 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d501bb9c-d88d-4362-a48e-4d0347ecc90e","Type":"ContainerStarted","Data":"d6051b0c5bd2d63d9f43ae131b460100c45ef28a76e95d9c82c1f29baab7429d"}
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.700425 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3be4280c-f244-49ee-8731-bf39ac51ee1e","Type":"ContainerStarted","Data":"ce3f73b3878e9503e479cb8deefa9a49d72c579fcd3b7d49136ba600b5e48a5d"}
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.702520 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb457dfc5-294j8" event={"ID":"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8","Type":"ContainerStarted","Data":"0596995a387e00d4157105eb8a5a484304961f9369875c777b77a509e3229e35"}
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.942180 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d56cb75f7-5b6cr"]
Jan 30 08:27:56 crc kubenswrapper[4870]: W0130 08:27:56.945093 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fc2a1f3_54bc_4554_a413_69bc35b58a2f.slice/crio-7e31fa7de08f6fe5e038c395ec730d87de7703f0769687addc3ef103068cb495 WatchSource:0}: Error finding container 7e31fa7de08f6fe5e038c395ec730d87de7703f0769687addc3ef103068cb495: Status 404 returned error can't find the container with id 7e31fa7de08f6fe5e038c395ec730d87de7703f0769687addc3ef103068cb495
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.965429 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbb4d475f-66fsw"]
Jan 30 08:27:56 crc kubenswrapper[4870]: I0130 08:27:56.995367 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-b57k5"]
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.034944 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:27:57 crc kubenswrapper[4870]: W0130 08:27:57.051255 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bf6a0a0_7d14_4cbd_96e4_c81ac5366fb2.slice/crio-0a2dbcce6be2e5137bdbc1dec4f8f525f5301e8818ebf37e38952868cb263db6 WatchSource:0}: Error finding container 0a2dbcce6be2e5137bdbc1dec4f8f525f5301e8818ebf37e38952868cb263db6: Status 404 returned error can't find the container with id 0a2dbcce6be2e5137bdbc1dec4f8f525f5301e8818ebf37e38952868cb263db6
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.053003 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fb6b548c7-56kg5"]
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.060985 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-d2mx7"]
Jan 30 08:27:57 crc kubenswrapper[4870]: W0130 08:27:57.065099 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3bd649e_5c3c_495f_933f_3b516167cbd2.slice/crio-545706fcd20ca2bc6bb776e0ab7efebb3759d48b4ad6e3c5ac851eb2e476dd66 WatchSource:0}: Error finding container 545706fcd20ca2bc6bb776e0ab7efebb3759d48b4ad6e3c5ac851eb2e476dd66: Status 404 returned error can't find the container with id 545706fcd20ca2bc6bb776e0ab7efebb3759d48b4ad6e3c5ac851eb2e476dd66
Jan 30 08:27:57 crc kubenswrapper[4870]: W0130 08:27:57.065930 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe9b9169_ab54_46ee_acb5_d1dc0047e59c.slice/crio-c04be222da792bc780412525fe8f824421a705de698d7617b4310714f4dd2987 WatchSource:0}: Error finding container c04be222da792bc780412525fe8f824421a705de698d7617b4310714f4dd2987: Status 404 returned error can't find the container with id c04be222da792bc780412525fe8f824421a705de698d7617b4310714f4dd2987
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.324110 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.338859 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fb6b548c7-56kg5"]
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.449937 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5949fbc84f-vdxjp"]
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.454246 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.477822 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5949fbc84f-vdxjp"]
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.492206 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.579415 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-config-data\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.579670 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj77z\" (UniqueName: \"kubernetes.io/projected/4171155c-1d8c-48a0-9675-1c730f9130dc-kube-api-access-xj77z\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.579735 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-scripts\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.579779 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4171155c-1d8c-48a0-9675-1c730f9130dc-horizon-secret-key\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.579812 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4171155c-1d8c-48a0-9675-1c730f9130dc-logs\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.682305 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4171155c-1d8c-48a0-9675-1c730f9130dc-logs\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.682398 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-config-data\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.682430 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj77z\" (UniqueName: \"kubernetes.io/projected/4171155c-1d8c-48a0-9675-1c730f9130dc-kube-api-access-xj77z\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.682485 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-scripts\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.682527 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4171155c-1d8c-48a0-9675-1c730f9130dc-horizon-secret-key\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.682819 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4171155c-1d8c-48a0-9675-1c730f9130dc-logs\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.683949 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-scripts\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.684094 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-config-data\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.689298 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4171155c-1d8c-48a0-9675-1c730f9130dc-horizon-secret-key\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.704182 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj77z\" (UniqueName: \"kubernetes.io/projected/4171155c-1d8c-48a0-9675-1c730f9130dc-kube-api-access-xj77z\") pod \"horizon-5949fbc84f-vdxjp\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.733760 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b57k5" event={"ID":"1435e0c6-e24a-44d4-bf78-3e5300e23cdd","Type":"ContainerStarted","Data":"09157ba4d27c88f05b3bcbf87559e2f7cd18de58cca7f08d086b19254b605ef0"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.737306 4870 generic.go:334] "Generic (PLEG): container finished" podID="fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" containerID="4dee564f51791741ba4e6cb761d557f18b1c1b23b27913388a9dbed54cdf4c9e" exitCode=0
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.737391 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb457dfc5-294j8" event={"ID":"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8","Type":"ContainerDied","Data":"4dee564f51791741ba4e6cb761d557f18b1c1b23b27913388a9dbed54cdf4c9e"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.742937 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9g27p" event={"ID":"685bde78-dea1-4864-a825-af176178bd11","Type":"ContainerStarted","Data":"208d52cf2eb05b40a226e4a1738a5480e549f0f08e4a6ad6b02178c49de677f7"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.743980 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9mjj4" event={"ID":"505df376-c8bc-44ce-9c14-8cf94730c550","Type":"ContainerStarted","Data":"51fd04d1413a7bb8dd1010fcf50ab478d7211a73c87542e70aaae3ce82cc9053"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.744003 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9mjj4" event={"ID":"505df376-c8bc-44ce-9c14-8cf94730c550","Type":"ContainerStarted","Data":"171c5c07c9fb91c243425bc5e80be08d88ca8fe65555c57f93bf344f77f94faf"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.749136 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d56cb75f7-5b6cr" event={"ID":"547994a2-f3d5-4ac9-a025-2644e86fe00d","Type":"ContainerStarted","Data":"3856f7ed6f6f6e5768d9f69766b166a6b372b3a627f11ff19e5e79941f1444f8"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.751657 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3be4280c-f244-49ee-8731-bf39ac51ee1e","Type":"ContainerStarted","Data":"4613e64cbab9e5f1e4c63d6f62c12ca1aa12d4d56ce06a57b13b9b4dcc74559f"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.754611 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb6b548c7-56kg5" event={"ID":"fe9b9169-ab54-46ee-acb5-d1dc0047e59c","Type":"ContainerStarted","Data":"c04be222da792bc780412525fe8f824421a705de698d7617b4310714f4dd2987"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.755901 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d2mx7" event={"ID":"c3bd649e-5c3c-495f-933f-3b516167cbd2","Type":"ContainerStarted","Data":"545706fcd20ca2bc6bb776e0ab7efebb3759d48b4ad6e3c5ac851eb2e476dd66"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.757223 4870 generic.go:334] "Generic (PLEG): container finished" podID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerID="38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c" exitCode=0
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.757278 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" event={"ID":"7fc2a1f3-54bc-4554-a413-69bc35b58a2f","Type":"ContainerDied","Data":"38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.757297 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" event={"ID":"7fc2a1f3-54bc-4554-a413-69bc35b58a2f","Type":"ContainerStarted","Data":"7e31fa7de08f6fe5e038c395ec730d87de7703f0769687addc3ef103068cb495"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.767593 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerStarted","Data":"0a2dbcce6be2e5137bdbc1dec4f8f525f5301e8818ebf37e38952868cb263db6"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.769747 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9mjj4" podStartSLOduration=2.769733879 podStartE2EDuration="2.769733879s" podCreationTimestamp="2026-01-30 08:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:57.766433116 +0000 UTC m=+1116.461980225" watchObservedRunningTime="2026-01-30 08:27:57.769733879 +0000 UTC m=+1116.465280988"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.772668 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vd7q8" event={"ID":"38224a9d-ced6-4f76-8117-18e7ca7f33e7","Type":"ContainerStarted","Data":"3cc3794f576037b7275283832f1fcd12d44b3421b4fb40fee74fe7e2b82882e4"}
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.814004 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:27:57 crc kubenswrapper[4870]: I0130 08:27:57.818387 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vd7q8" podStartSLOduration=3.818370551 podStartE2EDuration="3.818370551s" podCreationTimestamp="2026-01-30 08:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:27:57.809830536 +0000 UTC m=+1116.505377665" watchObservedRunningTime="2026-01-30 08:27:57.818370551 +0000 UTC m=+1116.513917650"
Jan 30 08:28:03 crc kubenswrapper[4870]: I0130 08:28:03.843298 4870 generic.go:334] "Generic (PLEG): container finished" podID="38224a9d-ced6-4f76-8117-18e7ca7f33e7" containerID="3cc3794f576037b7275283832f1fcd12d44b3421b4fb40fee74fe7e2b82882e4" exitCode=0
Jan 30 08:28:03 crc kubenswrapper[4870]: I0130 08:28:03.843341 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vd7q8" event={"ID":"38224a9d-ced6-4f76-8117-18e7ca7f33e7","Type":"ContainerDied","Data":"3cc3794f576037b7275283832f1fcd12d44b3421b4fb40fee74fe7e2b82882e4"}
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.305492 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d56cb75f7-5b6cr"]
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.384017 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74569d8966-5sjxs"]
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.385645 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.388584 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.393000 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74569d8966-5sjxs"]
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.446315 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-config-data\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.446393 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-secret-key\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.446485 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-scripts\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.446507 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1872a14d-aeff-46f7-8430-c6fe0eb6973b-logs\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.446611 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-combined-ca-bundle\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.446640 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-tls-certs\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.446666 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdz8p\" (UniqueName: \"kubernetes.io/projected/1872a14d-aeff-46f7-8430-c6fe0eb6973b-kube-api-access-fdz8p\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.456060 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5949fbc84f-vdxjp"]
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.488985 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-769d7654db-gw44c"]
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.490664 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.524037 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-769d7654db-gw44c"]
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548244 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-scripts\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548287 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1872a14d-aeff-46f7-8430-c6fe0eb6973b-logs\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548324 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-logs\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548367 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-horizon-tls-certs\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548397 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-scripts\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548427 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-config-data\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548444 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-combined-ca-bundle\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548466 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8bg8\" (UniqueName: \"kubernetes.io/projected/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-kube-api-access-s8bg8\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548487 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-tls-certs\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548508 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdz8p\" (UniqueName: \"kubernetes.io/projected/1872a14d-aeff-46f7-8430-c6fe0eb6973b-kube-api-access-fdz8p\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548538 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-config-data\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548563 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-combined-ca-bundle\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548586 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-secret-key\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.548615 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-horizon-secret-key\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.550088 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-scripts\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.551243 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-config-data\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.551749 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1872a14d-aeff-46f7-8430-c6fe0eb6973b-logs\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.555411 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-combined-ca-bundle\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.559889 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-tls-certs\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.560181 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-secret-key\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.571925 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdz8p\" (UniqueName: \"kubernetes.io/projected/1872a14d-aeff-46f7-8430-c6fe0eb6973b-kube-api-access-fdz8p\") pod \"horizon-74569d8966-5sjxs\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.650555 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-scripts\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.651161 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-config-data\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.651296 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-scripts\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.651402 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8bg8\" (UniqueName: \"kubernetes.io/projected/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-kube-api-access-s8bg8\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.651593 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-combined-ca-bundle\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.651648 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-horizon-secret-key\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.652246 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-logs\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.652322 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-horizon-tls-certs\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.652555 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-logs\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.652860 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-config-data\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.656059 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-combined-ca-bundle\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.656504 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-horizon-tls-certs\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.661766 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-horizon-secret-key\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.683971 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8bg8\" (UniqueName: \"kubernetes.io/projected/b6c9337c-50ce-4c5c-a84f-8092d25fa1e2-kube-api-access-s8bg8\") pod \"horizon-769d7654db-gw44c\" (UID: \"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2\") " pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.710162 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:04 crc kubenswrapper[4870]: I0130 08:28:04.815381 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:12 crc kubenswrapper[4870]: E0130 08:28:12.192996 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-glance-api:watcher_latest"
Jan 30 08:28:12 crc kubenswrapper[4870]: E0130 08:28:12.193505 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-glance-api:watcher_latest"
Jan 30 08:28:12 crc kubenswrapper[4870]: E0130 08:28:12.193700 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.23:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7n8n8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount
,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-tssp8_openstack(edd09a42-14b6-4161-ba2a-82c4cf4f5983): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:28:12 crc kubenswrapper[4870]: E0130 08:28:12.195238 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-tssp8" podUID="edd09a42-14b6-4161-ba2a-82c4cf4f5983" Jan 30 08:28:12 crc kubenswrapper[4870]: E0130 08:28:12.935293 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-tssp8" podUID="edd09a42-14b6-4161-ba2a-82c4cf4f5983" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.388833 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bb457dfc5-294j8" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.395409 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vd7q8" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.555675 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nf46\" (UniqueName: \"kubernetes.io/projected/38224a9d-ced6-4f76-8117-18e7ca7f33e7-kube-api-access-2nf46\") pod \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.555728 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-sb\") pod \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.555845 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-svc\") pod \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.555932 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-credential-keys\") pod \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.555952 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-fernet-keys\") pod \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\" (UID: 
\"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.555991 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-nb\") pod \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.556106 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-config\") pod \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.556181 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-swift-storage-0\") pod \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.556214 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-combined-ca-bundle\") pod \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.556245 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-config-data\") pod \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.556267 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-scripts\") pod \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\" (UID: \"38224a9d-ced6-4f76-8117-18e7ca7f33e7\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.556301 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s8jg\" (UniqueName: \"kubernetes.io/projected/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-kube-api-access-8s8jg\") pod \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\" (UID: \"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8\") " Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.562451 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "38224a9d-ced6-4f76-8117-18e7ca7f33e7" (UID: "38224a9d-ced6-4f76-8117-18e7ca7f33e7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.563650 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-kube-api-access-8s8jg" (OuterVolumeSpecName: "kube-api-access-8s8jg") pod "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" (UID: "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8"). InnerVolumeSpecName "kube-api-access-8s8jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.565565 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38224a9d-ced6-4f76-8117-18e7ca7f33e7-kube-api-access-2nf46" (OuterVolumeSpecName: "kube-api-access-2nf46") pod "38224a9d-ced6-4f76-8117-18e7ca7f33e7" (UID: "38224a9d-ced6-4f76-8117-18e7ca7f33e7"). InnerVolumeSpecName "kube-api-access-2nf46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.566061 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-scripts" (OuterVolumeSpecName: "scripts") pod "38224a9d-ced6-4f76-8117-18e7ca7f33e7" (UID: "38224a9d-ced6-4f76-8117-18e7ca7f33e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.566153 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "38224a9d-ced6-4f76-8117-18e7ca7f33e7" (UID: "38224a9d-ced6-4f76-8117-18e7ca7f33e7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.580802 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" (UID: "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.587158 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" (UID: "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.589104 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" (UID: "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.589938 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38224a9d-ced6-4f76-8117-18e7ca7f33e7" (UID: "38224a9d-ced6-4f76-8117-18e7ca7f33e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.593674 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" (UID: "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.600482 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-config" (OuterVolumeSpecName: "config") pod "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" (UID: "fc98de4d-b882-4f13-bc7e-1e6070ffd7d8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.611106 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-config-data" (OuterVolumeSpecName: "config-data") pod "38224a9d-ced6-4f76-8117-18e7ca7f33e7" (UID: "38224a9d-ced6-4f76-8117-18e7ca7f33e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658430 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s8jg\" (UniqueName: \"kubernetes.io/projected/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-kube-api-access-8s8jg\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658458 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nf46\" (UniqueName: \"kubernetes.io/projected/38224a9d-ced6-4f76-8117-18e7ca7f33e7-kube-api-access-2nf46\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658469 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658479 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658488 4870 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658496 4870 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658504 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658512 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658519 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658527 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658536 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: I0130 08:28:24.658544 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38224a9d-ced6-4f76-8117-18e7ca7f33e7-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:24 crc kubenswrapper[4870]: E0130 08:28:24.757094 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-watcher-applier:watcher_latest" Jan 30 08:28:24 crc 
kubenswrapper[4870]: E0130 08:28:24.757167 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-watcher-applier:watcher_latest" Jan 30 08:28:24 crc kubenswrapper[4870]: E0130 08:28:24.757481 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-applier,Image:38.102.83.23:5001/podified-master-centos10/openstack-watcher-applier:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d6h75h596h649h5fhd7hdfhc9h587h5f7h55bhd8hdbh5d4h5dfh567h5c7hcbh5b7h5b7h686h64ch95h5cfh557h654h568h54ch646hdh8bh9fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-applier-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/watcher,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gqd86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeH
andler{Exec:&ExecAction{Command:[/usr/bin/pgrep -r DRST watcher-applier],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pgrep -r DRST watcher-applier],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42451,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pgrep -r DRST watcher-applier],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-applier-0_openstack(d501bb9c-d88d-4362-a48e-4d0347ecc90e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:28:24 crc kubenswrapper[4870]: E0130 08:28:24.758999 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-applier\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.037789 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb457dfc5-294j8" event={"ID":"fc98de4d-b882-4f13-bc7e-1e6070ffd7d8","Type":"ContainerDied","Data":"0596995a387e00d4157105eb8a5a484304961f9369875c777b77a509e3229e35"} Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.038107 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb457dfc5-294j8" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.038415 4870 scope.go:117] "RemoveContainer" containerID="4dee564f51791741ba4e6cb761d557f18b1c1b23b27913388a9dbed54cdf4c9e" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.042200 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vd7q8" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.042989 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vd7q8" event={"ID":"38224a9d-ced6-4f76-8117-18e7ca7f33e7","Type":"ContainerDied","Data":"bf4fa656174794487020ac5e8e368cdaeecb6ab51935c7a52ce6d58beb8d1571"} Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.043024 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf4fa656174794487020ac5e8e368cdaeecb6ab51935c7a52ce6d58beb8d1571" Jan 30 08:28:25 crc kubenswrapper[4870]: E0130 08:28:25.044964 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-applier\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/podified-master-centos10/openstack-watcher-applier:watcher_latest\\\"\"" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.106354 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bb457dfc5-294j8"] Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.113522 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5bb457dfc5-294j8"] Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.254425 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.254473 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.492356 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vd7q8"] Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.500859 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vd7q8"] Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.600806 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-g4m9m"] Jan 30 08:28:25 crc kubenswrapper[4870]: E0130 08:28:25.601248 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" containerName="init" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.601269 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" containerName="init" Jan 30 08:28:25 crc kubenswrapper[4870]: E0130 08:28:25.601300 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38224a9d-ced6-4f76-8117-18e7ca7f33e7" containerName="keystone-bootstrap" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.601306 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="38224a9d-ced6-4f76-8117-18e7ca7f33e7" containerName="keystone-bootstrap" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.601463 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" containerName="init" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.601496 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="38224a9d-ced6-4f76-8117-18e7ca7f33e7" containerName="keystone-bootstrap" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.602099 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.605110 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.605326 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.605563 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.605766 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.605947 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vn7b5" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.612014 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g4m9m"] Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.677940 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-config-data\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " 
pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.678195 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-credential-keys\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.678258 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-combined-ca-bundle\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.678289 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-scripts\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.678541 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-fernet-keys\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.678620 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-kube-api-access-t66xr\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " 
pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.782064 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-credential-keys\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.782144 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-combined-ca-bundle\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.782171 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-scripts\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.782233 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-fernet-keys\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.782264 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-kube-api-access-t66xr\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 
08:28:25.782431 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-config-data\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.786670 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-scripts\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.788488 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-fernet-keys\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.788700 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-config-data\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.792389 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-credential-keys\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.793043 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-combined-ca-bundle\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.803506 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-kube-api-access-t66xr\") pod \"keystone-bootstrap-g4m9m\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: I0130 08:28:25.924059 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:25 crc kubenswrapper[4870]: E0130 08:28:25.983042 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Jan 30 08:28:25 crc kubenswrapper[4870]: E0130 08:28:25.983107 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Jan 30 08:28:25 crc kubenswrapper[4870]: E0130 08:28:25.983272 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.23:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lgl2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9g27p_openstack(685bde78-dea1-4864-a825-af176178bd11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:28:25 crc kubenswrapper[4870]: E0130 08:28:25.984677 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9g27p" podUID="685bde78-dea1-4864-a825-af176178bd11" Jan 30 08:28:26 crc kubenswrapper[4870]: E0130 08:28:26.075300 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-9g27p" podUID="685bde78-dea1-4864-a825-af176178bd11" Jan 30 08:28:26 crc kubenswrapper[4870]: I0130 08:28:26.092547 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38224a9d-ced6-4f76-8117-18e7ca7f33e7" path="/var/lib/kubelet/pods/38224a9d-ced6-4f76-8117-18e7ca7f33e7/volumes" Jan 30 08:28:26 crc kubenswrapper[4870]: I0130 08:28:26.093169 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc98de4d-b882-4f13-bc7e-1e6070ffd7d8" path="/var/lib/kubelet/pods/fc98de4d-b882-4f13-bc7e-1e6070ffd7d8/volumes" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.272263 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4: Get 
\"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-barbican-api/blobs/sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4\": context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.272688 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = reading blob sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4: Get \"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-barbican-api/blobs/sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4\": context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.272806 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.23:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bmq48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-d2mx7_openstack(c3bd649e-5c3c-495f-933f-3b516167cbd2): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4: Get \"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-barbican-api/blobs/sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4\": context canceled" logger="UnhandledError" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.277475 4870 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.277508 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.277602 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64dh689h676h7fh5bbh58fhb7h574h6bh58fh7dh579hdh5bh669h67fhfh5bfh559hch6bh95h567h597hc9h5fbhc9h98h7dh5d4h84h655q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7gl4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityCon
text{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6d56cb75f7-5b6cr_openstack(547994a2-f3d5-4ac9-a025-2644e86fe00d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.277648 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0: Get \"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-ceilometer-central/blobs/sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0\": context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.277677 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = reading blob sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0: Get \"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-ceilometer-central/blobs/sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0\": context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.277754 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:38.102.83.23:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n99h86h65bhdbh5cdh5dfh97h656h68bh548hch567h5b4h685hf7h567h5f8h67dh8dh68h657h75h699h5d9h696hb4h588h99h545hf6h694h67q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdb9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0: Get \"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-ceilometer-central/blobs/sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0\": context canceled" logger="UnhandledError" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.277802 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4: Get \\\"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-barbican-api/blobs/sha256:9297fe5be2ac1cc1fd34b411e74e2fd0c8cfdcfffd9039224ac97aa7f09437b4\\\": context canceled\"" pod="openstack/barbican-db-sync-d2mx7" podUID="c3bd649e-5c3c-495f-933f-3b516167cbd2" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.290799 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-6d56cb75f7-5b6cr" podUID="547994a2-f3d5-4ac9-a025-2644e86fe00d" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.321138 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.321462 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.321584 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57bhdch68hbdh584h658h559hf5h8bh689h564hfbh65h6bh5c7hcdh6ch6fh57h584h667h5fbh586hddh56bh694hbdh6hb6hbchc6h5fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnppg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5fb6b548c7-56kg5_openstack(fe9b9169-ab54-46ee-acb5-d1dc0047e59c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:28:32 crc kubenswrapper[4870]: E0130 08:28:32.324449 
4870 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-5fb6b548c7-56kg5" podUID="fe9b9169-ab54-46ee-acb5-d1dc0047e59c" Jan 30 08:28:33 crc kubenswrapper[4870]: E0130 08:28:33.136422 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-d2mx7" podUID="c3bd649e-5c3c-495f-933f-3b516167cbd2" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.167554 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb6b548c7-56kg5" event={"ID":"fe9b9169-ab54-46ee-acb5-d1dc0047e59c","Type":"ContainerDied","Data":"c04be222da792bc780412525fe8f824421a705de698d7617b4310714f4dd2987"} Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.168465 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c04be222da792bc780412525fe8f824421a705de698d7617b4310714f4dd2987" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.171824 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d56cb75f7-5b6cr" event={"ID":"547994a2-f3d5-4ac9-a025-2644e86fe00d","Type":"ContainerDied","Data":"3856f7ed6f6f6e5768d9f69766b166a6b372b3a627f11ff19e5e79941f1444f8"} Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.172002 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3856f7ed6f6f6e5768d9f69766b166a6b372b3a627f11ff19e5e79941f1444f8" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.184798 4870 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d56cb75f7-5b6cr" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.219484 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fb6b548c7-56kg5" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362456 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gl4q\" (UniqueName: \"kubernetes.io/projected/547994a2-f3d5-4ac9-a025-2644e86fe00d-kube-api-access-7gl4q\") pod \"547994a2-f3d5-4ac9-a025-2644e86fe00d\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362519 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-horizon-secret-key\") pod \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362559 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-config-data\") pod \"547994a2-f3d5-4ac9-a025-2644e86fe00d\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362577 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-scripts\") pod \"547994a2-f3d5-4ac9-a025-2644e86fe00d\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362657 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-config-data\") pod 
\"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362709 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/547994a2-f3d5-4ac9-a025-2644e86fe00d-horizon-secret-key\") pod \"547994a2-f3d5-4ac9-a025-2644e86fe00d\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362746 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnppg\" (UniqueName: \"kubernetes.io/projected/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-kube-api-access-jnppg\") pod \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362773 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-logs\") pod \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362802 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-scripts\") pod \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\" (UID: \"fe9b9169-ab54-46ee-acb5-d1dc0047e59c\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.362854 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/547994a2-f3d5-4ac9-a025-2644e86fe00d-logs\") pod \"547994a2-f3d5-4ac9-a025-2644e86fe00d\" (UID: \"547994a2-f3d5-4ac9-a025-2644e86fe00d\") " Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.363167 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-config-data" (OuterVolumeSpecName: "config-data") pod "547994a2-f3d5-4ac9-a025-2644e86fe00d" (UID: "547994a2-f3d5-4ac9-a025-2644e86fe00d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.364360 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-scripts" (OuterVolumeSpecName: "scripts") pod "547994a2-f3d5-4ac9-a025-2644e86fe00d" (UID: "547994a2-f3d5-4ac9-a025-2644e86fe00d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.364784 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-scripts" (OuterVolumeSpecName: "scripts") pod "fe9b9169-ab54-46ee-acb5-d1dc0047e59c" (UID: "fe9b9169-ab54-46ee-acb5-d1dc0047e59c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.365087 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-logs" (OuterVolumeSpecName: "logs") pod "fe9b9169-ab54-46ee-acb5-d1dc0047e59c" (UID: "fe9b9169-ab54-46ee-acb5-d1dc0047e59c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.365272 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/547994a2-f3d5-4ac9-a025-2644e86fe00d-logs" (OuterVolumeSpecName: "logs") pod "547994a2-f3d5-4ac9-a025-2644e86fe00d" (UID: "547994a2-f3d5-4ac9-a025-2644e86fe00d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.365261 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-config-data" (OuterVolumeSpecName: "config-data") pod "fe9b9169-ab54-46ee-acb5-d1dc0047e59c" (UID: "fe9b9169-ab54-46ee-acb5-d1dc0047e59c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.367347 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-kube-api-access-jnppg" (OuterVolumeSpecName: "kube-api-access-jnppg") pod "fe9b9169-ab54-46ee-acb5-d1dc0047e59c" (UID: "fe9b9169-ab54-46ee-acb5-d1dc0047e59c"). InnerVolumeSpecName "kube-api-access-jnppg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.369033 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fe9b9169-ab54-46ee-acb5-d1dc0047e59c" (UID: "fe9b9169-ab54-46ee-acb5-d1dc0047e59c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.369358 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/547994a2-f3d5-4ac9-a025-2644e86fe00d-kube-api-access-7gl4q" (OuterVolumeSpecName: "kube-api-access-7gl4q") pod "547994a2-f3d5-4ac9-a025-2644e86fe00d" (UID: "547994a2-f3d5-4ac9-a025-2644e86fe00d"). InnerVolumeSpecName "kube-api-access-7gl4q". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.369637 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547994a2-f3d5-4ac9-a025-2644e86fe00d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "547994a2-f3d5-4ac9-a025-2644e86fe00d" (UID: "547994a2-f3d5-4ac9-a025-2644e86fe00d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.414586 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74569d8966-5sjxs"]
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.426482 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5949fbc84f-vdxjp"]
Jan 30 08:28:34 crc kubenswrapper[4870]: W0130 08:28:34.430235 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1872a14d_aeff_46f7_8430_c6fe0eb6973b.slice/crio-f7250a53827f362fa55ae4df1436ef860d73a09ab3dfd65756154cbbf24973a7 WatchSource:0}: Error finding container f7250a53827f362fa55ae4df1436ef860d73a09ab3dfd65756154cbbf24973a7: Status 404 returned error can't find the container with id f7250a53827f362fa55ae4df1436ef860d73a09ab3dfd65756154cbbf24973a7
Jan 30 08:28:34 crc kubenswrapper[4870]: W0130 08:28:34.431352 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4171155c_1d8c_48a0_9675_1c730f9130dc.slice/crio-3cd56b7866047c6a3192e1f27cec1f489317b2b2d8b6d0b806475464e27ca26f WatchSource:0}: Error finding container 3cd56b7866047c6a3192e1f27cec1f489317b2b2d8b6d0b806475464e27ca26f: Status 404 returned error can't find the container with id 3cd56b7866047c6a3192e1f27cec1f489317b2b2d8b6d0b806475464e27ca26f
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467058 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467095 4870 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/547994a2-f3d5-4ac9-a025-2644e86fe00d-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467105 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnppg\" (UniqueName: \"kubernetes.io/projected/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-kube-api-access-jnppg\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467114 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467123 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467131 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/547994a2-f3d5-4ac9-a025-2644e86fe00d-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467139 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gl4q\" (UniqueName: \"kubernetes.io/projected/547994a2-f3d5-4ac9-a025-2644e86fe00d-kube-api-access-7gl4q\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467149 4870 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe9b9169-ab54-46ee-acb5-d1dc0047e59c-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467157 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.467165 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547994a2-f3d5-4ac9-a025-2644e86fe00d-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.557073 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-769d7654db-gw44c"]
Jan 30 08:28:34 crc kubenswrapper[4870]: I0130 08:28:34.610948 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g4m9m"]
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.186664 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769d7654db-gw44c" event={"ID":"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2","Type":"ContainerStarted","Data":"cfc50cfb0483cb8c170c6df31ba4eb709bb2e1c2c8750dc908b1c4114238e8fc"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.187039 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769d7654db-gw44c" event={"ID":"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2","Type":"ContainerStarted","Data":"97d8f52cba6eb7fd327935fabe36ebafa5dc36db888e1cb7a1e601511d9a23b2"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.192424 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g4m9m" event={"ID":"b9b91a69-f8ad-4d1d-a47d-c1921071c71a","Type":"ContainerStarted","Data":"091f6d41669e606ed42188e0c975f67619382a546808d176937498d135759acb"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.193911 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f","Type":"ContainerStarted","Data":"5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.196675 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74569d8966-5sjxs" event={"ID":"1872a14d-aeff-46f7-8430-c6fe0eb6973b","Type":"ContainerStarted","Data":"43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.196716 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74569d8966-5sjxs" event={"ID":"1872a14d-aeff-46f7-8430-c6fe0eb6973b","Type":"ContainerStarted","Data":"f7250a53827f362fa55ae4df1436ef860d73a09ab3dfd65756154cbbf24973a7"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.203626 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5949fbc84f-vdxjp" event={"ID":"4171155c-1d8c-48a0-9675-1c730f9130dc","Type":"ContainerStarted","Data":"57f324a7af1ed982422d86d531cc1073fc7e06667530fd2daf47081062016e35"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.203674 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5949fbc84f-vdxjp" event={"ID":"4171155c-1d8c-48a0-9675-1c730f9130dc","Type":"ContainerStarted","Data":"3cd56b7866047c6a3192e1f27cec1f489317b2b2d8b6d0b806475464e27ca26f"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.205846 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" event={"ID":"7fc2a1f3-54bc-4554-a413-69bc35b58a2f","Type":"ContainerStarted","Data":"893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.206776 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.208719 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3be4280c-f244-49ee-8731-bf39ac51ee1e","Type":"ContainerStarted","Data":"2440043bd103b6c3935a639d98298276ce819674a69c89d3a193f378017291b1"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.208832 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api-log" containerID="cri-o://4613e64cbab9e5f1e4c63d6f62c12ca1aa12d4d56ce06a57b13b9b4dcc74559f" gracePeriod=30
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.210993 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api" containerID="cri-o://2440043bd103b6c3935a639d98298276ce819674a69c89d3a193f378017291b1" gracePeriod=30
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.211060 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.217275 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d56cb75f7-5b6cr"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.223237 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": EOF"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.226466 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b57k5" event={"ID":"1435e0c6-e24a-44d4-bf78-3e5300e23cdd","Type":"ContainerStarted","Data":"ac1cfe0654d6d9f59d0d7bba982a578597204c7a7dcbab5f91122bf878031c77"}
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.227223 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fb6b548c7-56kg5"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.234203 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=5.347058933 podStartE2EDuration="41.234179717s" podCreationTimestamp="2026-01-30 08:27:54 +0000 UTC" firstStartedPulling="2026-01-30 08:27:56.385979362 +0000 UTC m=+1115.081526461" lastFinishedPulling="2026-01-30 08:28:32.273100126 +0000 UTC m=+1150.968647245" observedRunningTime="2026-01-30 08:28:35.226845269 +0000 UTC m=+1153.922392378" watchObservedRunningTime="2026-01-30 08:28:35.234179717 +0000 UTC m=+1153.929726836"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.256068 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-b57k5" podStartSLOduration=3.255683937 podStartE2EDuration="40.256045817s" podCreationTimestamp="2026-01-30 08:27:55 +0000 UTC" firstStartedPulling="2026-01-30 08:27:57.045217666 +0000 UTC m=+1115.740764775" lastFinishedPulling="2026-01-30 08:28:34.045579546 +0000 UTC m=+1152.741126655" observedRunningTime="2026-01-30 08:28:35.243464346 +0000 UTC m=+1153.939011465" watchObservedRunningTime="2026-01-30 08:28:35.256045817 +0000 UTC m=+1153.951592946"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.279684 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" podStartSLOduration=40.279667391 podStartE2EDuration="40.279667391s" podCreationTimestamp="2026-01-30 08:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:28:35.275372117 +0000 UTC m=+1153.970919226" watchObservedRunningTime="2026-01-30 08:28:35.279667391 +0000 UTC m=+1153.975214500"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.299495 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=40.299480387 podStartE2EDuration="40.299480387s" podCreationTimestamp="2026-01-30 08:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:28:35.294293955 +0000 UTC m=+1153.989841064" watchObservedRunningTime="2026-01-30 08:28:35.299480387 +0000 UTC m=+1153.995027496"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.388277 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fb6b548c7-56kg5"]
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.406379 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fb6b548c7-56kg5"]
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.421594 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d56cb75f7-5b6cr"]
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.430131 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d56cb75f7-5b6cr"]
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.651184 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.654077 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Jan 30 08:28:35 crc kubenswrapper[4870]: I0130 08:28:35.684556 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.092836 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="547994a2-f3d5-4ac9-a025-2644e86fe00d" path="/var/lib/kubelet/pods/547994a2-f3d5-4ac9-a025-2644e86fe00d/volumes"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.094303 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe9b9169-ab54-46ee-acb5-d1dc0047e59c" path="/var/lib/kubelet/pods/fe9b9169-ab54-46ee-acb5-d1dc0047e59c/volumes"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.232834 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74569d8966-5sjxs" event={"ID":"1872a14d-aeff-46f7-8430-c6fe0eb6973b","Type":"ContainerStarted","Data":"960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3"}
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.239270 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5949fbc84f-vdxjp" event={"ID":"4171155c-1d8c-48a0-9675-1c730f9130dc","Type":"ContainerStarted","Data":"8001ca067558561639146186888f3fefa9a3f66b8cfe6da27c20754262532feb"}
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.239422 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5949fbc84f-vdxjp" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon-log" containerID="cri-o://57f324a7af1ed982422d86d531cc1073fc7e06667530fd2daf47081062016e35" gracePeriod=30
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.239478 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5949fbc84f-vdxjp" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon" containerID="cri-o://8001ca067558561639146186888f3fefa9a3f66b8cfe6da27c20754262532feb" gracePeriod=30
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.267902 4870 generic.go:334] "Generic (PLEG): container finished" podID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerID="4613e64cbab9e5f1e4c63d6f62c12ca1aa12d4d56ce06a57b13b9b4dcc74559f" exitCode=143
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.267983 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3be4280c-f244-49ee-8731-bf39ac51ee1e","Type":"ContainerDied","Data":"4613e64cbab9e5f1e4c63d6f62c12ca1aa12d4d56ce06a57b13b9b4dcc74559f"}
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.318819 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5949fbc84f-vdxjp" podStartSLOduration=39.159902825 podStartE2EDuration="39.318800955s" podCreationTimestamp="2026-01-30 08:27:57 +0000 UTC" firstStartedPulling="2026-01-30 08:28:34.434012482 +0000 UTC m=+1153.129559581" lastFinishedPulling="2026-01-30 08:28:34.592910602 +0000 UTC m=+1153.288457711" observedRunningTime="2026-01-30 08:28:36.310566458 +0000 UTC m=+1155.006113567" watchObservedRunningTime="2026-01-30 08:28:36.318800955 +0000 UTC m=+1155.014348064"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.322305 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769d7654db-gw44c" event={"ID":"b6c9337c-50ce-4c5c-a84f-8092d25fa1e2","Type":"ContainerStarted","Data":"90c555109c0c2fc282c1e031c1ed41dac3b5e5d7868f70671a0deb315bc10fb6"}
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.326108 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g4m9m" event={"ID":"b9b91a69-f8ad-4d1d-a47d-c1921071c71a","Type":"ContainerStarted","Data":"423e4f8207599a836d08eca85be2c21680c69e731edaed6ac9d59c605d325bfb"}
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.326890 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-74569d8966-5sjxs" podStartSLOduration=32.170821835 podStartE2EDuration="32.326863096s" podCreationTimestamp="2026-01-30 08:28:04 +0000 UTC" firstStartedPulling="2026-01-30 08:28:34.434295881 +0000 UTC m=+1153.129842990" lastFinishedPulling="2026-01-30 08:28:34.590337142 +0000 UTC m=+1153.285884251" observedRunningTime="2026-01-30 08:28:36.2581884 +0000 UTC m=+1154.953735509" watchObservedRunningTime="2026-01-30 08:28:36.326863096 +0000 UTC m=+1155.022410205"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.343786 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tssp8" event={"ID":"edd09a42-14b6-4161-ba2a-82c4cf4f5983","Type":"ContainerStarted","Data":"85f1049088e388e69d6da33f4eab9143943bc4d4ba2179d9093657152d474310"}
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.345040 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.366658 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-769d7654db-gw44c" podStartSLOduration=32.366639391 podStartE2EDuration="32.366639391s" podCreationTimestamp="2026-01-30 08:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:28:36.352585014 +0000 UTC m=+1155.048132123" watchObservedRunningTime="2026-01-30 08:28:36.366639391 +0000 UTC m=+1155.062186500"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.416731 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.421508 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tssp8" podStartSLOduration=4.644066308 podStartE2EDuration="48.421491697s" podCreationTimestamp="2026-01-30 08:27:48 +0000 UTC" firstStartedPulling="2026-01-30 08:27:50.273078351 +0000 UTC m=+1108.968625460" lastFinishedPulling="2026-01-30 08:28:34.05050374 +0000 UTC m=+1152.746050849" observedRunningTime="2026-01-30 08:28:36.411990391 +0000 UTC m=+1155.107537510" watchObservedRunningTime="2026-01-30 08:28:36.421491697 +0000 UTC m=+1155.117038806"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.422083 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-g4m9m" podStartSLOduration=11.422077786 podStartE2EDuration="11.422077786s" podCreationTimestamp="2026-01-30 08:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:28:36.391247347 +0000 UTC m=+1155.086794466" watchObservedRunningTime="2026-01-30 08:28:36.422077786 +0000 UTC m=+1155.117624895"
Jan 30 08:28:36 crc kubenswrapper[4870]: I0130 08:28:36.477848 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:28:37 crc kubenswrapper[4870]: I0130 08:28:37.814793 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5949fbc84f-vdxjp"
Jan 30 08:28:38 crc kubenswrapper[4870]: I0130 08:28:38.360334 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" containerName="watcher-decision-engine" containerID="cri-o://5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24" gracePeriod=30
Jan 30 08:28:38 crc kubenswrapper[4870]: I0130 08:28:38.967548 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": read tcp 10.217.0.2:50244->10.217.0.152:9322: read: connection reset by peer"
Jan 30 08:28:39 crc kubenswrapper[4870]: I0130 08:28:39.371273 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerStarted","Data":"b45a58a1e3e4865b397313616e6494da5d8e1887dd9401a657303e526b984274"}
Jan 30 08:28:39 crc kubenswrapper[4870]: I0130 08:28:39.373166 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9g27p" event={"ID":"685bde78-dea1-4864-a825-af176178bd11","Type":"ContainerStarted","Data":"c723dc182803022ba9e618ac6407cbccb617a7c5a0a43457386f580c7a154614"}
Jan 30 08:28:39 crc kubenswrapper[4870]: I0130 08:28:39.377227 4870 generic.go:334] "Generic (PLEG): container finished" podID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerID="2440043bd103b6c3935a639d98298276ce819674a69c89d3a193f378017291b1" exitCode=0
Jan 30 08:28:39 crc kubenswrapper[4870]: I0130 08:28:39.377266 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3be4280c-f244-49ee-8731-bf39ac51ee1e","Type":"ContainerDied","Data":"2440043bd103b6c3935a639d98298276ce819674a69c89d3a193f378017291b1"}
Jan 30 08:28:39 crc kubenswrapper[4870]: I0130 08:28:39.409936 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9g27p" podStartSLOduration=2.847636672 podStartE2EDuration="44.409917018s" podCreationTimestamp="2026-01-30 08:27:55 +0000 UTC" firstStartedPulling="2026-01-30 08:27:56.700961035 +0000 UTC m=+1115.396508144" lastFinishedPulling="2026-01-30 08:28:38.263241371 +0000 UTC m=+1156.958788490" observedRunningTime="2026-01-30 08:28:39.400682401 +0000 UTC m=+1158.096229510" watchObservedRunningTime="2026-01-30 08:28:39.409917018 +0000 UTC m=+1158.105464137"
Jan 30 08:28:39 crc kubenswrapper[4870]: I0130 08:28:39.899012 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.013330 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be4280c-f244-49ee-8731-bf39ac51ee1e-logs\") pod \"3be4280c-f244-49ee-8731-bf39ac51ee1e\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") "
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.013375 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-custom-prometheus-ca\") pod \"3be4280c-f244-49ee-8731-bf39ac51ee1e\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") "
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.014310 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be4280c-f244-49ee-8731-bf39ac51ee1e-logs" (OuterVolumeSpecName: "logs") pod "3be4280c-f244-49ee-8731-bf39ac51ee1e" (UID: "3be4280c-f244-49ee-8731-bf39ac51ee1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.014349 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-combined-ca-bundle\") pod \"3be4280c-f244-49ee-8731-bf39ac51ee1e\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") "
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.014499 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-config-data\") pod \"3be4280c-f244-49ee-8731-bf39ac51ee1e\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") "
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.014549 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gsr9\" (UniqueName: \"kubernetes.io/projected/3be4280c-f244-49ee-8731-bf39ac51ee1e-kube-api-access-2gsr9\") pod \"3be4280c-f244-49ee-8731-bf39ac51ee1e\" (UID: \"3be4280c-f244-49ee-8731-bf39ac51ee1e\") "
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.014941 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be4280c-f244-49ee-8731-bf39ac51ee1e-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.019855 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be4280c-f244-49ee-8731-bf39ac51ee1e-kube-api-access-2gsr9" (OuterVolumeSpecName: "kube-api-access-2gsr9") pod "3be4280c-f244-49ee-8731-bf39ac51ee1e" (UID: "3be4280c-f244-49ee-8731-bf39ac51ee1e"). InnerVolumeSpecName "kube-api-access-2gsr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.046979 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3be4280c-f244-49ee-8731-bf39ac51ee1e" (UID: "3be4280c-f244-49ee-8731-bf39ac51ee1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.050509 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3be4280c-f244-49ee-8731-bf39ac51ee1e" (UID: "3be4280c-f244-49ee-8731-bf39ac51ee1e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.091819 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-config-data" (OuterVolumeSpecName: "config-data") pod "3be4280c-f244-49ee-8731-bf39ac51ee1e" (UID: "3be4280c-f244-49ee-8731-bf39ac51ee1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.116917 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.116946 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.116955 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gsr9\" (UniqueName: \"kubernetes.io/projected/3be4280c-f244-49ee-8731-bf39ac51ee1e-kube-api-access-2gsr9\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.116966 4870 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3be4280c-f244-49ee-8731-bf39ac51ee1e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.409213 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3be4280c-f244-49ee-8731-bf39ac51ee1e","Type":"ContainerDied","Data":"ce3f73b3878e9503e479cb8deefa9a49d72c579fcd3b7d49136ba600b5e48a5d"}
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.409268 4870 scope.go:117] "RemoveContainer" containerID="2440043bd103b6c3935a639d98298276ce819674a69c89d3a193f378017291b1"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.409417 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.422812 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d501bb9c-d88d-4362-a48e-4d0347ecc90e","Type":"ContainerStarted","Data":"ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc"}
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.430476 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.453737 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.487424 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:40 crc kubenswrapper[4870]: E0130 08:28:40.487905 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.487929 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api"
Jan 30 08:28:40 crc kubenswrapper[4870]: E0130 08:28:40.487951 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api-log"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.487959 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api-log"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.488512 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.488543 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" containerName="watcher-api-log"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.489692 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.493071 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.502192 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.319949566 podStartE2EDuration="46.502171953s" podCreationTimestamp="2026-01-30 08:27:54 +0000 UTC" firstStartedPulling="2026-01-30 08:27:56.279812662 +0000 UTC m=+1114.975359771" lastFinishedPulling="2026-01-30 08:28:39.462035049 +0000 UTC m=+1158.157582158" observedRunningTime="2026-01-30 08:28:40.450324602 +0000 UTC m=+1159.145871731" watchObservedRunningTime="2026-01-30 08:28:40.502171953 +0000 UTC m=+1159.197719072"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.502691 4870 scope.go:117] "RemoveContainer" containerID="4613e64cbab9e5f1e4c63d6f62c12ca1aa12d4d56ce06a57b13b9b4dcc74559f"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.520827 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.527056 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.527129 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkg6n\" (UniqueName: \"kubernetes.io/projected/b63b26d0-7049-490c-97dc-117bbbf5fa01-kube-api-access-kkg6n\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.527163 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-config-data\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.527266 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.527356 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63b26d0-7049-490c-97dc-117bbbf5fa01-logs\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.576959 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.629058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63b26d0-7049-490c-97dc-117bbbf5fa01-logs\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.629150 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.629188 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkg6n\" (UniqueName: \"kubernetes.io/projected/b63b26d0-7049-490c-97dc-117bbbf5fa01-kube-api-access-kkg6n\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.629228 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-config-data\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.629298 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.629543 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63b26d0-7049-490c-97dc-117bbbf5fa01-logs\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.639689 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-config-data\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0"
Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.641581 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\"
(UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0" Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.642523 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0" Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.702785 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkg6n\" (UniqueName: \"kubernetes.io/projected/b63b26d0-7049-490c-97dc-117bbbf5fa01-kube-api-access-kkg6n\") pod \"watcher-api-0\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") " pod="openstack/watcher-api-0" Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.820912 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.898102 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.965007 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757cc9679f-wq2nt"] Jan 30 08:28:40 crc kubenswrapper[4870]: I0130 08:28:40.965240 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" podUID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerName="dnsmasq-dns" containerID="cri-o://72ef45c7c24f8c5fc6788a5862241b333000ac3fd13dcb6350ad2553d12d13f8" gracePeriod=10 Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.432000 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.456371 4870 generic.go:334] "Generic (PLEG): container finished" podID="b9b91a69-f8ad-4d1d-a47d-c1921071c71a" containerID="423e4f8207599a836d08eca85be2c21680c69e731edaed6ac9d59c605d325bfb" exitCode=0 Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.456475 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g4m9m" event={"ID":"b9b91a69-f8ad-4d1d-a47d-c1921071c71a","Type":"ContainerDied","Data":"423e4f8207599a836d08eca85be2c21680c69e731edaed6ac9d59c605d325bfb"} Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.476272 4870 generic.go:334] "Generic (PLEG): container finished" podID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerID="72ef45c7c24f8c5fc6788a5862241b333000ac3fd13dcb6350ad2553d12d13f8" exitCode=0 Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.477037 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" 
event={"ID":"c409417e-6b71-491c-b7c5-bf1a2b63baed","Type":"ContainerDied","Data":"72ef45c7c24f8c5fc6788a5862241b333000ac3fd13dcb6350ad2553d12d13f8"} Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.659541 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.761423 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-swift-storage-0\") pod \"c409417e-6b71-491c-b7c5-bf1a2b63baed\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.761464 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mpz4\" (UniqueName: \"kubernetes.io/projected/c409417e-6b71-491c-b7c5-bf1a2b63baed-kube-api-access-5mpz4\") pod \"c409417e-6b71-491c-b7c5-bf1a2b63baed\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.761525 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-config\") pod \"c409417e-6b71-491c-b7c5-bf1a2b63baed\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.761560 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-sb\") pod \"c409417e-6b71-491c-b7c5-bf1a2b63baed\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.761637 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-nb\") pod \"c409417e-6b71-491c-b7c5-bf1a2b63baed\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.762345 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-svc\") pod \"c409417e-6b71-491c-b7c5-bf1a2b63baed\" (UID: \"c409417e-6b71-491c-b7c5-bf1a2b63baed\") " Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.784510 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c409417e-6b71-491c-b7c5-bf1a2b63baed-kube-api-access-5mpz4" (OuterVolumeSpecName: "kube-api-access-5mpz4") pod "c409417e-6b71-491c-b7c5-bf1a2b63baed" (UID: "c409417e-6b71-491c-b7c5-bf1a2b63baed"). InnerVolumeSpecName "kube-api-access-5mpz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.826888 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c409417e-6b71-491c-b7c5-bf1a2b63baed" (UID: "c409417e-6b71-491c-b7c5-bf1a2b63baed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.838371 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c409417e-6b71-491c-b7c5-bf1a2b63baed" (UID: "c409417e-6b71-491c-b7c5-bf1a2b63baed"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.847321 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-config" (OuterVolumeSpecName: "config") pod "c409417e-6b71-491c-b7c5-bf1a2b63baed" (UID: "c409417e-6b71-491c-b7c5-bf1a2b63baed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.859129 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c409417e-6b71-491c-b7c5-bf1a2b63baed" (UID: "c409417e-6b71-491c-b7c5-bf1a2b63baed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.862394 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c409417e-6b71-491c-b7c5-bf1a2b63baed" (UID: "c409417e-6b71-491c-b7c5-bf1a2b63baed"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.863845 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.863883 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mpz4\" (UniqueName: \"kubernetes.io/projected/c409417e-6b71-491c-b7c5-bf1a2b63baed-kube-api-access-5mpz4\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.863896 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.863908 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.863919 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:41 crc kubenswrapper[4870]: I0130 08:28:41.863928 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c409417e-6b71-491c-b7c5-bf1a2b63baed-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.093807 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be4280c-f244-49ee-8731-bf39ac51ee1e" path="/var/lib/kubelet/pods/3be4280c-f244-49ee-8731-bf39ac51ee1e/volumes" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.497430 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" event={"ID":"c409417e-6b71-491c-b7c5-bf1a2b63baed","Type":"ContainerDied","Data":"27e7324b5df73e3d0d4ead3bf3867e6cd08ef4e1f33f8795d331a5b682f586af"} Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.497702 4870 scope.go:117] "RemoveContainer" containerID="72ef45c7c24f8c5fc6788a5862241b333000ac3fd13dcb6350ad2553d12d13f8" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.497470 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757cc9679f-wq2nt" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.508252 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b63b26d0-7049-490c-97dc-117bbbf5fa01","Type":"ContainerStarted","Data":"be9af03cdaac6d198c36c22f6a72da93c6a8876d7522a083b84ad37b4c4205a3"} Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.508300 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.508311 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b63b26d0-7049-490c-97dc-117bbbf5fa01","Type":"ContainerStarted","Data":"2f2cc7b48ae21cc95e0db1fb5e108ac03cb5f4ea23904284d0731df4d6012673"} Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.508320 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b63b26d0-7049-490c-97dc-117bbbf5fa01","Type":"ContainerStarted","Data":"e929fb45514c18c709b7cf772f7d4133121a7d17d636aed37d87cc9bcf50a23c"} Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.525897 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757cc9679f-wq2nt"] Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.532762 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757cc9679f-wq2nt"] Jan 30 
08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.538316 4870 scope.go:117] "RemoveContainer" containerID="0bc8a7090e4dbce528560fe9634da361240a61440f10f7e3c579fda9915e352a" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.541632 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.541613354 podStartE2EDuration="2.541613354s" podCreationTimestamp="2026-01-30 08:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:28:42.539316332 +0000 UTC m=+1161.234863451" watchObservedRunningTime="2026-01-30 08:28:42.541613354 +0000 UTC m=+1161.237160463" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.895331 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.984949 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-scripts\") pod \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.985296 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-combined-ca-bundle\") pod \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.985317 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-fernet-keys\") pod \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " Jan 30 08:28:42 crc kubenswrapper[4870]: 
I0130 08:28:42.985338 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-credential-keys\") pod \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.985370 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-kube-api-access-t66xr\") pod \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.985398 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-config-data\") pod \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\" (UID: \"b9b91a69-f8ad-4d1d-a47d-c1921071c71a\") " Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.993226 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b9b91a69-f8ad-4d1d-a47d-c1921071c71a" (UID: "b9b91a69-f8ad-4d1d-a47d-c1921071c71a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:42 crc kubenswrapper[4870]: I0130 08:28:42.993250 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-kube-api-access-t66xr" (OuterVolumeSpecName: "kube-api-access-t66xr") pod "b9b91a69-f8ad-4d1d-a47d-c1921071c71a" (UID: "b9b91a69-f8ad-4d1d-a47d-c1921071c71a"). InnerVolumeSpecName "kube-api-access-t66xr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.006596 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b9b91a69-f8ad-4d1d-a47d-c1921071c71a" (UID: "b9b91a69-f8ad-4d1d-a47d-c1921071c71a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.006926 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-scripts" (OuterVolumeSpecName: "scripts") pod "b9b91a69-f8ad-4d1d-a47d-c1921071c71a" (UID: "b9b91a69-f8ad-4d1d-a47d-c1921071c71a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.014004 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9b91a69-f8ad-4d1d-a47d-c1921071c71a" (UID: "b9b91a69-f8ad-4d1d-a47d-c1921071c71a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.014577 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-config-data" (OuterVolumeSpecName: "config-data") pod "b9b91a69-f8ad-4d1d-a47d-c1921071c71a" (UID: "b9b91a69-f8ad-4d1d-a47d-c1921071c71a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.087023 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.087061 4870 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.087074 4870 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.087087 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-kube-api-access-t66xr\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.087099 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.087109 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9b91a69-f8ad-4d1d-a47d-c1921071c71a-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.526339 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g4m9m" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.526339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g4m9m" event={"ID":"b9b91a69-f8ad-4d1d-a47d-c1921071c71a","Type":"ContainerDied","Data":"091f6d41669e606ed42188e0c975f67619382a546808d176937498d135759acb"} Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.526385 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="091f6d41669e606ed42188e0c975f67619382a546808d176937498d135759acb" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.733918 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55b585f57f-9h2lg"] Jan 30 08:28:43 crc kubenswrapper[4870]: E0130 08:28:43.734440 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b91a69-f8ad-4d1d-a47d-c1921071c71a" containerName="keystone-bootstrap" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.734461 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b91a69-f8ad-4d1d-a47d-c1921071c71a" containerName="keystone-bootstrap" Jan 30 08:28:43 crc kubenswrapper[4870]: E0130 08:28:43.734475 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerName="init" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.734483 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerName="init" Jan 30 08:28:43 crc kubenswrapper[4870]: E0130 08:28:43.734507 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerName="dnsmasq-dns" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.734516 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerName="dnsmasq-dns" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.734744 4870 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c409417e-6b71-491c-b7c5-bf1a2b63baed" containerName="dnsmasq-dns" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.734759 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b91a69-f8ad-4d1d-a47d-c1921071c71a" containerName="keystone-bootstrap" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.735653 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.741430 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.741460 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.741669 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.741780 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vn7b5" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.745927 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.746258 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.746586 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55b585f57f-9h2lg"] Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800383 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-combined-ca-bundle\") pod \"keystone-55b585f57f-9h2lg\" (UID: 
\"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800451 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-internal-tls-certs\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800571 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-scripts\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800643 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-public-tls-certs\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800685 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dpph\" (UniqueName: \"kubernetes.io/projected/cb9f4cfa-0698-47dd-9319-47b185d2f937-kube-api-access-6dpph\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800716 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-config-data\") pod \"keystone-55b585f57f-9h2lg\" 
(UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800742 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-credential-keys\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.800775 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-fernet-keys\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.902328 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-public-tls-certs\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.902385 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dpph\" (UniqueName: \"kubernetes.io/projected/cb9f4cfa-0698-47dd-9319-47b185d2f937-kube-api-access-6dpph\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.902414 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-config-data\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " 
pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.902436 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-credential-keys\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.902461 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-fernet-keys\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.902492 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-combined-ca-bundle\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.903427 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-internal-tls-certs\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.903527 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-scripts\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 
08:28:43.907918 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-internal-tls-certs\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.908381 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-fernet-keys\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.908421 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-credential-keys\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.909319 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-public-tls-certs\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.909559 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-config-data\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.910048 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-scripts\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.910113 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9f4cfa-0698-47dd-9319-47b185d2f937-combined-ca-bundle\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:43 crc kubenswrapper[4870]: I0130 08:28:43.943052 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dpph\" (UniqueName: \"kubernetes.io/projected/cb9f4cfa-0698-47dd-9319-47b185d2f937-kube-api-access-6dpph\") pod \"keystone-55b585f57f-9h2lg\" (UID: \"cb9f4cfa-0698-47dd-9319-47b185d2f937\") " pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:44 crc kubenswrapper[4870]: I0130 08:28:44.060299 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:44 crc kubenswrapper[4870]: I0130 08:28:44.086863 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c409417e-6b71-491c-b7c5-bf1a2b63baed" path="/var/lib/kubelet/pods/c409417e-6b71-491c-b7c5-bf1a2b63baed/volumes" Jan 30 08:28:44 crc kubenswrapper[4870]: I0130 08:28:44.650401 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55b585f57f-9h2lg"] Jan 30 08:28:44 crc kubenswrapper[4870]: I0130 08:28:44.714584 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-74569d8966-5sjxs" Jan 30 08:28:44 crc kubenswrapper[4870]: I0130 08:28:44.714947 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74569d8966-5sjxs" Jan 30 08:28:44 crc kubenswrapper[4870]: I0130 08:28:44.815755 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-769d7654db-gw44c" Jan 30 08:28:44 crc kubenswrapper[4870]: I0130 08:28:44.816655 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-769d7654db-gw44c" Jan 30 08:28:45 crc kubenswrapper[4870]: I0130 08:28:45.520550 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Jan 30 08:28:45 crc kubenswrapper[4870]: I0130 08:28:45.528521 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 08:28:45 crc kubenswrapper[4870]: I0130 08:28:45.547899 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Jan 30 08:28:45 crc kubenswrapper[4870]: I0130 08:28:45.555948 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55b585f57f-9h2lg" 
event={"ID":"cb9f4cfa-0698-47dd-9319-47b185d2f937","Type":"ContainerStarted","Data":"4324809eb746d26c074eeb09c271b7fa56245b6d95f28298ee1ac9c6cd8c371f"} Jan 30 08:28:45 crc kubenswrapper[4870]: I0130 08:28:45.588678 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Jan 30 08:28:45 crc kubenswrapper[4870]: I0130 08:28:45.631493 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 08:28:45 crc kubenswrapper[4870]: I0130 08:28:45.821777 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 08:28:47 crc kubenswrapper[4870]: I0130 08:28:47.570706 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" containerID="cri-o://ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" gracePeriod=30 Jan 30 08:28:50 crc kubenswrapper[4870]: E0130 08:28:50.523369 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:28:50 crc kubenswrapper[4870]: E0130 08:28:50.525388 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:28:50 crc kubenswrapper[4870]: E0130 08:28:50.526676 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:28:50 crc kubenswrapper[4870]: E0130 08:28:50.526734 4870 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:28:50 crc kubenswrapper[4870]: I0130 08:28:50.604386 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55b585f57f-9h2lg" event={"ID":"cb9f4cfa-0698-47dd-9319-47b185d2f937","Type":"ContainerStarted","Data":"e8483b8d3774b1165f5fdf771316420579c648c24476ba3c43ca752d0dec0955"} Jan 30 08:28:50 crc kubenswrapper[4870]: I0130 08:28:50.604898 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:28:50 crc kubenswrapper[4870]: I0130 08:28:50.623582 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55b585f57f-9h2lg" podStartSLOduration=7.623564484 podStartE2EDuration="7.623564484s" podCreationTimestamp="2026-01-30 08:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:28:50.62188745 +0000 UTC m=+1169.317434569" watchObservedRunningTime="2026-01-30 08:28:50.623564484 +0000 UTC m=+1169.319111593" Jan 30 08:28:50 crc kubenswrapper[4870]: I0130 08:28:50.821971 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Jan 30 08:28:50 crc kubenswrapper[4870]: I0130 08:28:50.825484 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Jan 30 08:28:51 crc kubenswrapper[4870]: I0130 08:28:51.622700 
4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 08:28:52 crc kubenswrapper[4870]: I0130 08:28:52.640511 4870 generic.go:334] "Generic (PLEG): container finished" podID="1435e0c6-e24a-44d4-bf78-3e5300e23cdd" containerID="ac1cfe0654d6d9f59d0d7bba982a578597204c7a7dcbab5f91122bf878031c77" exitCode=0 Jan 30 08:28:52 crc kubenswrapper[4870]: I0130 08:28:52.640579 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b57k5" event={"ID":"1435e0c6-e24a-44d4-bf78-3e5300e23cdd","Type":"ContainerDied","Data":"ac1cfe0654d6d9f59d0d7bba982a578597204c7a7dcbab5f91122bf878031c77"} Jan 30 08:28:52 crc kubenswrapper[4870]: I0130 08:28:52.642749 4870 generic.go:334] "Generic (PLEG): container finished" podID="8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" containerID="5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24" exitCode=1 Jan 30 08:28:52 crc kubenswrapper[4870]: I0130 08:28:52.642798 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f","Type":"ContainerDied","Data":"5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24"} Jan 30 08:28:54 crc kubenswrapper[4870]: I0130 08:28:54.510587 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Jan 30 08:28:54 crc kubenswrapper[4870]: I0130 08:28:54.516421 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api-log" containerID="cri-o://2f2cc7b48ae21cc95e0db1fb5e108ac03cb5f4ea23904284d0731df4d6012673" gracePeriod=30 Jan 30 08:28:54 crc kubenswrapper[4870]: I0130 08:28:54.516578 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api" 
containerID="cri-o://be9af03cdaac6d198c36c22f6a72da93c6a8876d7522a083b84ad37b4c4205a3" gracePeriod=30 Jan 30 08:28:54 crc kubenswrapper[4870]: I0130 08:28:54.659938 4870 generic.go:334] "Generic (PLEG): container finished" podID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerID="2f2cc7b48ae21cc95e0db1fb5e108ac03cb5f4ea23904284d0731df4d6012673" exitCode=143 Jan 30 08:28:54 crc kubenswrapper[4870]: I0130 08:28:54.659966 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b63b26d0-7049-490c-97dc-117bbbf5fa01","Type":"ContainerDied","Data":"2f2cc7b48ae21cc95e0db1fb5e108ac03cb5f4ea23904284d0731df4d6012673"} Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.250246 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.250495 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.250533 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.251201 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"736e1ea4b0b2b4fa81dc9ec4fa9950e05f221b62734f7e8de7d7969e9158f7ae"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.251253 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://736e1ea4b0b2b4fa81dc9ec4fa9950e05f221b62734f7e8de7d7969e9158f7ae" gracePeriod=600 Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.521990 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.523579 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.525034 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.525096 4870 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" 
containerName="watcher-applier" Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.652543 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24 is running failed: container process not found" containerID="5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.654267 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24 is running failed: container process not found" containerID="5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.654606 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24 is running failed: container process not found" containerID="5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Jan 30 08:28:55 crc kubenswrapper[4870]: E0130 08:28:55.654674 4870 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" containerName="watcher-decision-engine" Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.669016 4870 generic.go:334] 
"Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="736e1ea4b0b2b4fa81dc9ec4fa9950e05f221b62734f7e8de7d7969e9158f7ae" exitCode=0 Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.669057 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"736e1ea4b0b2b4fa81dc9ec4fa9950e05f221b62734f7e8de7d7969e9158f7ae"} Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.669091 4870 scope.go:117] "RemoveContainer" containerID="902c9bb8b96922377d8d6da6fb79b392e9bc4a710daf7c3c1a77d5b9c2b536ac" Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.821614 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9322/\": dial tcp 10.217.0.165:9322: connect: connection refused" Jan 30 08:28:55 crc kubenswrapper[4870]: I0130 08:28:55.821679 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.165:9322/\": dial tcp 10.217.0.165:9322: connect: connection refused" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.689383 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b57k5" event={"ID":"1435e0c6-e24a-44d4-bf78-3e5300e23cdd","Type":"ContainerDied","Data":"09157ba4d27c88f05b3bcbf87559e2f7cd18de58cca7f08d086b19254b605ef0"} Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.689824 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09157ba4d27c88f05b3bcbf87559e2f7cd18de58cca7f08d086b19254b605ef0" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.692204 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f","Type":"ContainerDied","Data":"688a235f4ea3f6827668ea12cec3c801c91b77e9c743386ea44f9613e759f338"} Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.692242 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="688a235f4ea3f6827668ea12cec3c801c91b77e9c743386ea44f9613e759f338" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.694964 4870 generic.go:334] "Generic (PLEG): container finished" podID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerID="be9af03cdaac6d198c36c22f6a72da93c6a8876d7522a083b84ad37b4c4205a3" exitCode=0 Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.694999 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b63b26d0-7049-490c-97dc-117bbbf5fa01","Type":"ContainerDied","Data":"be9af03cdaac6d198c36c22f6a72da93c6a8876d7522a083b84ad37b4c4205a3"} Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.736045 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.791833 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-b57k5" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.818464 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-combined-ca-bundle\") pod \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.818596 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sskqz\" (UniqueName: \"kubernetes.io/projected/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-kube-api-access-sskqz\") pod \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.818633 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-config-data\") pod \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.818666 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-logs\") pod \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.818740 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-custom-prometheus-ca\") pod \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\" (UID: \"8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.825600 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-logs" (OuterVolumeSpecName: "logs") pod "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" (UID: "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.864303 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-kube-api-access-sskqz" (OuterVolumeSpecName: "kube-api-access-sskqz") pod "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" (UID: "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f"). InnerVolumeSpecName "kube-api-access-sskqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.919848 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-combined-ca-bundle\") pod \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.920248 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-scripts\") pod \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.920410 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2pwt\" (UniqueName: \"kubernetes.io/projected/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-kube-api-access-n2pwt\") pod \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.920450 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-config-data\") pod \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.920611 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-logs\") pod \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\" (UID: \"1435e0c6-e24a-44d4-bf78-3e5300e23cdd\") " Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.921105 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sskqz\" (UniqueName: \"kubernetes.io/projected/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-kube-api-access-sskqz\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.921130 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.921488 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-logs" (OuterVolumeSpecName: "logs") pod "1435e0c6-e24a-44d4-bf78-3e5300e23cdd" (UID: "1435e0c6-e24a-44d4-bf78-3e5300e23cdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.929341 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-kube-api-access-n2pwt" (OuterVolumeSpecName: "kube-api-access-n2pwt") pod "1435e0c6-e24a-44d4-bf78-3e5300e23cdd" (UID: "1435e0c6-e24a-44d4-bf78-3e5300e23cdd"). InnerVolumeSpecName "kube-api-access-n2pwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.933015 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-scripts" (OuterVolumeSpecName: "scripts") pod "1435e0c6-e24a-44d4-bf78-3e5300e23cdd" (UID: "1435e0c6-e24a-44d4-bf78-3e5300e23cdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.950059 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" (UID: "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.953770 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" (UID: "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.978474 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-config-data" (OuterVolumeSpecName: "config-data") pod "1435e0c6-e24a-44d4-bf78-3e5300e23cdd" (UID: "1435e0c6-e24a-44d4-bf78-3e5300e23cdd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.978583 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1435e0c6-e24a-44d4-bf78-3e5300e23cdd" (UID: "1435e0c6-e24a-44d4-bf78-3e5300e23cdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:56 crc kubenswrapper[4870]: I0130 08:28:56.996766 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-config-data" (OuterVolumeSpecName: "config-data") pod "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" (UID: "8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.000892 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022668 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022712 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022729 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022741 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022752 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022764 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2pwt\" (UniqueName: \"kubernetes.io/projected/1435e0c6-e24a-44d4-bf78-3e5300e23cdd-kube-api-access-n2pwt\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022779 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.022794 4870 reconciler_common.go:293] 
"Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.128455 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkg6n\" (UniqueName: \"kubernetes.io/projected/b63b26d0-7049-490c-97dc-117bbbf5fa01-kube-api-access-kkg6n\") pod \"b63b26d0-7049-490c-97dc-117bbbf5fa01\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") "
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.128559 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63b26d0-7049-490c-97dc-117bbbf5fa01-logs\") pod \"b63b26d0-7049-490c-97dc-117bbbf5fa01\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") "
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.128598 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-combined-ca-bundle\") pod \"b63b26d0-7049-490c-97dc-117bbbf5fa01\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") "
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.128686 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-custom-prometheus-ca\") pod \"b63b26d0-7049-490c-97dc-117bbbf5fa01\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") "
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.128777 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-config-data\") pod \"b63b26d0-7049-490c-97dc-117bbbf5fa01\" (UID: \"b63b26d0-7049-490c-97dc-117bbbf5fa01\") "
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.129064 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b63b26d0-7049-490c-97dc-117bbbf5fa01-logs" (OuterVolumeSpecName: "logs") pod "b63b26d0-7049-490c-97dc-117bbbf5fa01" (UID: "b63b26d0-7049-490c-97dc-117bbbf5fa01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.129432 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63b26d0-7049-490c-97dc-117bbbf5fa01-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.136143 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63b26d0-7049-490c-97dc-117bbbf5fa01-kube-api-access-kkg6n" (OuterVolumeSpecName: "kube-api-access-kkg6n") pod "b63b26d0-7049-490c-97dc-117bbbf5fa01" (UID: "b63b26d0-7049-490c-97dc-117bbbf5fa01"). InnerVolumeSpecName "kube-api-access-kkg6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.155997 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b63b26d0-7049-490c-97dc-117bbbf5fa01" (UID: "b63b26d0-7049-490c-97dc-117bbbf5fa01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.171967 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b63b26d0-7049-490c-97dc-117bbbf5fa01" (UID: "b63b26d0-7049-490c-97dc-117bbbf5fa01"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.178671 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-config-data" (OuterVolumeSpecName: "config-data") pod "b63b26d0-7049-490c-97dc-117bbbf5fa01" (UID: "b63b26d0-7049-490c-97dc-117bbbf5fa01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.231642 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.231685 4870 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.231712 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63b26d0-7049-490c-97dc-117bbbf5fa01-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.231729 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkg6n\" (UniqueName: \"kubernetes.io/projected/b63b26d0-7049-490c-97dc-117bbbf5fa01-kube-api-access-kkg6n\") on node \"crc\" DevicePath \"\""
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.240065 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.253235 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.709652 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerStarted","Data":"eb3296110237841074b9a042e7cdf380a70fa9819f62258f821b5124b40eb835"}
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.713609 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d2mx7" event={"ID":"c3bd649e-5c3c-495f-933f-3b516167cbd2","Type":"ContainerStarted","Data":"3e1279140ba8261786354ed9fbacfe2a2a43a2b8decaba7ca7c7b15754ed7ff9"}
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.731683 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"fed0dc1b3541c4793a049fc7617c3773e2d05f0ebfb934d3acc5ededede3b844"}
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.735744 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-d2mx7" podStartSLOduration=3.315431998 podStartE2EDuration="1m2.73572564s" podCreationTimestamp="2026-01-30 08:27:55 +0000 UTC" firstStartedPulling="2026-01-30 08:27:57.068697007 +0000 UTC m=+1115.764244116" lastFinishedPulling="2026-01-30 08:28:56.488990629 +0000 UTC m=+1175.184537758" observedRunningTime="2026-01-30 08:28:57.731747755 +0000 UTC m=+1176.427294874" watchObservedRunningTime="2026-01-30 08:28:57.73572564 +0000 UTC m=+1176.431272749"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.738397 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-b57k5"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.740231 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.741446 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.741453 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b63b26d0-7049-490c-97dc-117bbbf5fa01","Type":"ContainerDied","Data":"e929fb45514c18c709b7cf772f7d4133121a7d17d636aed37d87cc9bcf50a23c"}
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.741519 4870 scope.go:117] "RemoveContainer" containerID="be9af03cdaac6d198c36c22f6a72da93c6a8876d7522a083b84ad37b4c4205a3"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.768391 4870 scope.go:117] "RemoveContainer" containerID="2f2cc7b48ae21cc95e0db1fb5e108ac03cb5f4ea23904284d0731df4d6012673"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.802925 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.816716 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.829922 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.852380 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863311 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:28:57 crc kubenswrapper[4870]: E0130 08:28:57.863690 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1435e0c6-e24a-44d4-bf78-3e5300e23cdd" containerName="placement-db-sync"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863705 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1435e0c6-e24a-44d4-bf78-3e5300e23cdd" containerName="placement-db-sync"
Jan 30 08:28:57 crc kubenswrapper[4870]: E0130 08:28:57.863739 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863745 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api"
Jan 30 08:28:57 crc kubenswrapper[4870]: E0130 08:28:57.863757 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api-log"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863764 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api-log"
Jan 30 08:28:57 crc kubenswrapper[4870]: E0130 08:28:57.863787 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" containerName="watcher-decision-engine"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863794 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" containerName="watcher-decision-engine"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863971 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" containerName="watcher-decision-engine"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863984 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="1435e0c6-e24a-44d4-bf78-3e5300e23cdd" containerName="placement-db-sync"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.863991 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api-log"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.864007 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" containerName="watcher-api"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.864575 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.885774 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.889936 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.910235 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.911907 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.931143 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.931311 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.936130 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.954216 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8628af25-d5e4-46a0-adec-4c25ca39676b-logs\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.954277 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.954351 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.954377 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9h65\" (UniqueName: \"kubernetes.io/projected/8628af25-d5e4-46a0-adec-4c25ca39676b-kube-api-access-l9h65\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:57 crc kubenswrapper[4870]: I0130 08:28:57.954414 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.002945 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.064771 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.064837 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.064911 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.064952 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.064974 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964cd6aa-bebd-412e-bd1c-001d151a90e8-logs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.064994 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9h65\" (UniqueName: \"kubernetes.io/projected/8628af25-d5e4-46a0-adec-4c25ca39676b-kube-api-access-l9h65\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.065021 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l777x\" (UniqueName: \"kubernetes.io/projected/964cd6aa-bebd-412e-bd1c-001d151a90e8-kube-api-access-l777x\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.065040 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.065058 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-config-data\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.065076 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.065112 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8628af25-d5e4-46a0-adec-4c25ca39676b-logs\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.065128 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-public-tls-certs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.075527 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8628af25-d5e4-46a0-adec-4c25ca39676b-logs\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.075620 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.077023 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.080086 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.100090 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f" path="/var/lib/kubelet/pods/8eb0ea94-f1b1-41c4-a968-ff1d4af60e2f/volumes"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.101190 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63b26d0-7049-490c-97dc-117bbbf5fa01" path="/var/lib/kubelet/pods/b63b26d0-7049-490c-97dc-117bbbf5fa01/volumes"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.101825 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9h65\" (UniqueName: \"kubernetes.io/projected/8628af25-d5e4-46a0-adec-4c25ca39676b-kube-api-access-l9h65\") pod \"watcher-decision-engine-0\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.102176 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56cfc8cc98-pfz9w"]
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.103571 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.107699 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.107940 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-skpxp"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.112479 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.112576 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.113281 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.119581 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56cfc8cc98-pfz9w"]
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.216709 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.249795 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-internal-tls-certs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.249862 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-combined-ca-bundle\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.249953 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-public-tls-certs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.250015 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252046 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252088 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-scripts\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252120 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-logs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252177 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-config-data\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252255 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964cd6aa-bebd-412e-bd1c-001d151a90e8-logs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252282 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9sj\" (UniqueName: \"kubernetes.io/projected/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-kube-api-access-mk9sj\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252319 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-public-tls-certs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252381 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l777x\" (UniqueName: \"kubernetes.io/projected/964cd6aa-bebd-412e-bd1c-001d151a90e8-kube-api-access-l777x\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252420 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-config-data\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.252459 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.253858 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964cd6aa-bebd-412e-bd1c-001d151a90e8-logs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.259884 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.259830 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.260826 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.265120 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-config-data\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.275753 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/964cd6aa-bebd-412e-bd1c-001d151a90e8-public-tls-certs\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.288380 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l777x\" (UniqueName: \"kubernetes.io/projected/964cd6aa-bebd-412e-bd1c-001d151a90e8-kube-api-access-l777x\") pod \"watcher-api-0\" (UID: \"964cd6aa-bebd-412e-bd1c-001d151a90e8\") " pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.354922 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-scripts\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.354971 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-logs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.355005 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-config-data\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.355056 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9sj\" (UniqueName: \"kubernetes.io/projected/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-kube-api-access-mk9sj\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.355088 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-public-tls-certs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.355146 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-internal-tls-certs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.355177 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-combined-ca-bundle\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.356593 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-logs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.357724 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-scripts\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.358386 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-combined-ca-bundle\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.365409 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-public-tls-certs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.369363 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-config-data\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.376466 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-internal-tls-certs\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.376554 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9sj\" (UniqueName: \"kubernetes.io/projected/a0bafb1e-cef8-4a8c-bb78-a5d11d098691-kube-api-access-mk9sj\") pod \"placement-56cfc8cc98-pfz9w\" (UID: \"a0bafb1e-cef8-4a8c-bb78-a5d11d098691\") " pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.542412 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56cfc8cc98-pfz9w"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.552607 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Jan 30 08:28:58 crc kubenswrapper[4870]: I0130 08:28:58.771268 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:28:58 crc kubenswrapper[4870]: W0130 08:28:58.789032 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8628af25_d5e4_46a0_adec_4c25ca39676b.slice/crio-ac51c1c09a44741e0ef19b1f29a0de83b725ade9e973b9a5af1ac06cc507c0dc WatchSource:0}: Error finding container ac51c1c09a44741e0ef19b1f29a0de83b725ade9e973b9a5af1ac06cc507c0dc: Status 404 returned error can't find the container with id ac51c1c09a44741e0ef19b1f29a0de83b725ade9e973b9a5af1ac06cc507c0dc
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.137844 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56cfc8cc98-pfz9w"]
Jan 30 08:28:59 crc kubenswrapper[4870]: W0130 08:28:59.145222 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bafb1e_cef8_4a8c_bb78_a5d11d098691.slice/crio-8af75310cadb631a54fb1863b40ad4d025d0b2489ced96ec922b0c3169cd9a29 WatchSource:0}: Error finding container 8af75310cadb631a54fb1863b40ad4d025d0b2489ced96ec922b0c3169cd9a29: Status 404 returned error can't find the container with id 8af75310cadb631a54fb1863b40ad4d025d0b2489ced96ec922b0c3169cd9a29
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.246015 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.734974 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-769d7654db-gw44c"
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.784183 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"964cd6aa-bebd-412e-bd1c-001d151a90e8","Type":"ContainerStarted","Data":"0fc045b8752b23ed63b788c40396f5c241006226d574a9afb4b1fad123932dc3"}
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.784547 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"964cd6aa-bebd-412e-bd1c-001d151a90e8","Type":"ContainerStarted","Data":"58f3c594be8eea56316c01512e5ab0263514c3971eb0192f72124a7b62f71734"}
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.790317 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerStarted","Data":"6d57bab3d6c90d46f3ba26f44c864a6ec85718286b3ff7835459214c488726bc"}
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.790355 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerStarted","Data":"ac51c1c09a44741e0ef19b1f29a0de83b725ade9e973b9a5af1ac06cc507c0dc"}
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.817455 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74569d8966-5sjxs"]
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.817730 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74569d8966-5sjxs" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon-log" containerID="cri-o://43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25" gracePeriod=30
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.817865 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74569d8966-5sjxs" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" containerID="cri-o://960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3" gracePeriod=30
Jan 30 08:28:59 crc kubenswrapper[4870]: I0130
08:28:59.818892 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-74569d8966-5sjxs" Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.829288 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56cfc8cc98-pfz9w" event={"ID":"a0bafb1e-cef8-4a8c-bb78-a5d11d098691","Type":"ContainerStarted","Data":"5593830809dede98478a71c2749bca27e1d4bfb98c7d1db4c427feb53c451e33"} Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.829336 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56cfc8cc98-pfz9w" event={"ID":"a0bafb1e-cef8-4a8c-bb78-a5d11d098691","Type":"ContainerStarted","Data":"8af75310cadb631a54fb1863b40ad4d025d0b2489ced96ec922b0c3169cd9a29"} Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.835769 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74569d8966-5sjxs" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:43982->10.217.0.162:8443: read: connection reset by peer" Jan 30 08:28:59 crc kubenswrapper[4870]: I0130 08:28:59.847362 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.847337795 podStartE2EDuration="2.847337795s" podCreationTimestamp="2026-01-30 08:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:28:59.844472565 +0000 UTC m=+1178.540019684" watchObservedRunningTime="2026-01-30 08:28:59.847337795 +0000 UTC m=+1178.542884904" Jan 30 08:29:00 crc kubenswrapper[4870]: E0130 08:29:00.524064 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:00 crc kubenswrapper[4870]: E0130 08:29:00.526393 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:00 crc kubenswrapper[4870]: E0130 08:29:00.528549 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:00 crc kubenswrapper[4870]: E0130 08:29:00.528675 4870 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.847184 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"964cd6aa-bebd-412e-bd1c-001d151a90e8","Type":"ContainerStarted","Data":"9171df5b2927af9eba44065ea758545f3bfbeb9a8d3fe9faa2aa3e871785a17d"} Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.847516 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.850023 4870 generic.go:334] "Generic (PLEG): container finished" podID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerID="960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3" 
exitCode=0 Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.850049 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74569d8966-5sjxs" event={"ID":"1872a14d-aeff-46f7-8430-c6fe0eb6973b","Type":"ContainerDied","Data":"960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3"} Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.855636 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56cfc8cc98-pfz9w" event={"ID":"a0bafb1e-cef8-4a8c-bb78-a5d11d098691","Type":"ContainerStarted","Data":"b6151bfe62806a4b4bcfcd0ac7b669915ac2f5e32e8795136f885dfd849e126d"} Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.855713 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56cfc8cc98-pfz9w" Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.895001 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56cfc8cc98-pfz9w" podStartSLOduration=2.894983941 podStartE2EDuration="2.894983941s" podCreationTimestamp="2026-01-30 08:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:00.892803893 +0000 UTC m=+1179.588351002" watchObservedRunningTime="2026-01-30 08:29:00.894983941 +0000 UTC m=+1179.590531060" Jan 30 08:29:00 crc kubenswrapper[4870]: I0130 08:29:00.902361 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.902339822 podStartE2EDuration="3.902339822s" podCreationTimestamp="2026-01-30 08:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:00.869049316 +0000 UTC m=+1179.564596425" watchObservedRunningTime="2026-01-30 08:29:00.902339822 +0000 UTC m=+1179.597886931" Jan 30 08:29:01 crc kubenswrapper[4870]: I0130 08:29:01.867798 4870 
generic.go:334] "Generic (PLEG): container finished" podID="685bde78-dea1-4864-a825-af176178bd11" containerID="c723dc182803022ba9e618ac6407cbccb617a7c5a0a43457386f580c7a154614" exitCode=0 Jan 30 08:29:01 crc kubenswrapper[4870]: I0130 08:29:01.869226 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9g27p" event={"ID":"685bde78-dea1-4864-a825-af176178bd11","Type":"ContainerDied","Data":"c723dc182803022ba9e618ac6407cbccb617a7c5a0a43457386f580c7a154614"} Jan 30 08:29:01 crc kubenswrapper[4870]: I0130 08:29:01.870265 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56cfc8cc98-pfz9w" Jan 30 08:29:02 crc kubenswrapper[4870]: I0130 08:29:02.880114 4870 generic.go:334] "Generic (PLEG): container finished" podID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerID="6d57bab3d6c90d46f3ba26f44c864a6ec85718286b3ff7835459214c488726bc" exitCode=1 Jan 30 08:29:02 crc kubenswrapper[4870]: I0130 08:29:02.880165 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerDied","Data":"6d57bab3d6c90d46f3ba26f44c864a6ec85718286b3ff7835459214c488726bc"} Jan 30 08:29:02 crc kubenswrapper[4870]: I0130 08:29:02.882012 4870 scope.go:117] "RemoveContainer" containerID="6d57bab3d6c90d46f3ba26f44c864a6ec85718286b3ff7835459214c488726bc" Jan 30 08:29:03 crc kubenswrapper[4870]: I0130 08:29:03.327269 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 08:29:03 crc kubenswrapper[4870]: I0130 08:29:03.553562 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 08:29:03 crc kubenswrapper[4870]: I0130 08:29:03.890048 4870 generic.go:334] "Generic (PLEG): container finished" podID="c3bd649e-5c3c-495f-933f-3b516167cbd2" containerID="3e1279140ba8261786354ed9fbacfe2a2a43a2b8decaba7ca7c7b15754ed7ff9" exitCode=0 Jan 
30 08:29:03 crc kubenswrapper[4870]: I0130 08:29:03.890788 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d2mx7" event={"ID":"c3bd649e-5c3c-495f-933f-3b516167cbd2","Type":"ContainerDied","Data":"3e1279140ba8261786354ed9fbacfe2a2a43a2b8decaba7ca7c7b15754ed7ff9"} Jan 30 08:29:04 crc kubenswrapper[4870]: I0130 08:29:04.711699 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74569d8966-5sjxs" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Jan 30 08:29:06 crc kubenswrapper[4870]: E0130 08:29:05.522045 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:06 crc kubenswrapper[4870]: E0130 08:29:05.523346 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:06 crc kubenswrapper[4870]: E0130 08:29:05.524557 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:06 crc kubenswrapper[4870]: E0130 08:29:05.524591 4870 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:29:06 crc kubenswrapper[4870]: I0130 08:29:06.928560 4870 generic.go:334] "Generic (PLEG): container finished" podID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerID="8001ca067558561639146186888f3fefa9a3f66b8cfe6da27c20754262532feb" exitCode=137 Jan 30 08:29:06 crc kubenswrapper[4870]: I0130 08:29:06.928942 4870 generic.go:334] "Generic (PLEG): container finished" podID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerID="57f324a7af1ed982422d86d531cc1073fc7e06667530fd2daf47081062016e35" exitCode=137 Jan 30 08:29:06 crc kubenswrapper[4870]: I0130 08:29:06.928980 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5949fbc84f-vdxjp" event={"ID":"4171155c-1d8c-48a0-9675-1c730f9130dc","Type":"ContainerDied","Data":"8001ca067558561639146186888f3fefa9a3f66b8cfe6da27c20754262532feb"} Jan 30 08:29:06 crc kubenswrapper[4870]: I0130 08:29:06.929019 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5949fbc84f-vdxjp" event={"ID":"4171155c-1d8c-48a0-9675-1c730f9130dc","Type":"ContainerDied","Data":"57f324a7af1ed982422d86d531cc1073fc7e06667530fd2daf47081062016e35"} Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.123017 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.126354 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9g27p" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.179707 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-combined-ca-bundle\") pod \"c3bd649e-5c3c-495f-933f-3b516167cbd2\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.179836 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-db-sync-config-data\") pod \"c3bd649e-5c3c-495f-933f-3b516167cbd2\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.179949 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-combined-ca-bundle\") pod \"685bde78-dea1-4864-a825-af176178bd11\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.179979 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-config-data\") pod \"685bde78-dea1-4864-a825-af176178bd11\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.180036 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmq48\" (UniqueName: \"kubernetes.io/projected/c3bd649e-5c3c-495f-933f-3b516167cbd2-kube-api-access-bmq48\") pod \"c3bd649e-5c3c-495f-933f-3b516167cbd2\" (UID: \"c3bd649e-5c3c-495f-933f-3b516167cbd2\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.180077 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-db-sync-config-data\") pod \"685bde78-dea1-4864-a825-af176178bd11\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.180095 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bde78-dea1-4864-a825-af176178bd11-etc-machine-id\") pod \"685bde78-dea1-4864-a825-af176178bd11\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.180126 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-scripts\") pod \"685bde78-dea1-4864-a825-af176178bd11\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.180148 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgl2s\" (UniqueName: \"kubernetes.io/projected/685bde78-dea1-4864-a825-af176178bd11-kube-api-access-lgl2s\") pod \"685bde78-dea1-4864-a825-af176178bd11\" (UID: \"685bde78-dea1-4864-a825-af176178bd11\") " Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.180827 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/685bde78-dea1-4864-a825-af176178bd11-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "685bde78-dea1-4864-a825-af176178bd11" (UID: "685bde78-dea1-4864-a825-af176178bd11"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.196330 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "685bde78-dea1-4864-a825-af176178bd11" (UID: "685bde78-dea1-4864-a825-af176178bd11"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.196343 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-scripts" (OuterVolumeSpecName: "scripts") pod "685bde78-dea1-4864-a825-af176178bd11" (UID: "685bde78-dea1-4864-a825-af176178bd11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.196369 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c3bd649e-5c3c-495f-933f-3b516167cbd2" (UID: "c3bd649e-5c3c-495f-933f-3b516167cbd2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.196409 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bd649e-5c3c-495f-933f-3b516167cbd2-kube-api-access-bmq48" (OuterVolumeSpecName: "kube-api-access-bmq48") pod "c3bd649e-5c3c-495f-933f-3b516167cbd2" (UID: "c3bd649e-5c3c-495f-933f-3b516167cbd2"). InnerVolumeSpecName "kube-api-access-bmq48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.196471 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685bde78-dea1-4864-a825-af176178bd11-kube-api-access-lgl2s" (OuterVolumeSpecName: "kube-api-access-lgl2s") pod "685bde78-dea1-4864-a825-af176178bd11" (UID: "685bde78-dea1-4864-a825-af176178bd11"). InnerVolumeSpecName "kube-api-access-lgl2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.211736 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "685bde78-dea1-4864-a825-af176178bd11" (UID: "685bde78-dea1-4864-a825-af176178bd11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.225751 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3bd649e-5c3c-495f-933f-3b516167cbd2" (UID: "c3bd649e-5c3c-495f-933f-3b516167cbd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.252409 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-config-data" (OuterVolumeSpecName: "config-data") pod "685bde78-dea1-4864-a825-af176178bd11" (UID: "685bde78-dea1-4864-a825-af176178bd11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286503 4870 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286548 4870 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/685bde78-dea1-4864-a825-af176178bd11-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286560 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286575 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgl2s\" (UniqueName: \"kubernetes.io/projected/685bde78-dea1-4864-a825-af176178bd11-kube-api-access-lgl2s\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286589 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286601 4870 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3bd649e-5c3c-495f-933f-3b516167cbd2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286611 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 
08:29:07.286621 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685bde78-dea1-4864-a825-af176178bd11-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.286632 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmq48\" (UniqueName: \"kubernetes.io/projected/c3bd649e-5c3c-495f-933f-3b516167cbd2-kube-api-access-bmq48\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.939204 4870 generic.go:334] "Generic (PLEG): container finished" podID="505df376-c8bc-44ce-9c14-8cf94730c550" containerID="51fd04d1413a7bb8dd1010fcf50ab478d7211a73c87542e70aaae3ce82cc9053" exitCode=0 Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.939509 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9mjj4" event={"ID":"505df376-c8bc-44ce-9c14-8cf94730c550","Type":"ContainerDied","Data":"51fd04d1413a7bb8dd1010fcf50ab478d7211a73c87542e70aaae3ce82cc9053"} Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.943180 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9g27p" event={"ID":"685bde78-dea1-4864-a825-af176178bd11","Type":"ContainerDied","Data":"208d52cf2eb05b40a226e4a1738a5480e549f0f08e4a6ad6b02178c49de677f7"} Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.943222 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="208d52cf2eb05b40a226e4a1738a5480e549f0f08e4a6ad6b02178c49de677f7" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.943277 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9g27p" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.947163 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-d2mx7" event={"ID":"c3bd649e-5c3c-495f-933f-3b516167cbd2","Type":"ContainerDied","Data":"545706fcd20ca2bc6bb776e0ab7efebb3759d48b4ad6e3c5ac851eb2e476dd66"} Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.947206 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="545706fcd20ca2bc6bb776e0ab7efebb3759d48b4ad6e3c5ac851eb2e476dd66" Jan 30 08:29:07 crc kubenswrapper[4870]: I0130 08:29:07.947269 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-d2mx7" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.218243 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.218295 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.413988 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 08:29:08 crc kubenswrapper[4870]: E0130 08:29:08.414624 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685bde78-dea1-4864-a825-af176178bd11" containerName="cinder-db-sync" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.414645 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="685bde78-dea1-4864-a825-af176178bd11" containerName="cinder-db-sync" Jan 30 08:29:08 crc kubenswrapper[4870]: E0130 08:29:08.414657 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bd649e-5c3c-495f-933f-3b516167cbd2" containerName="barbican-db-sync" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.414664 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c3bd649e-5c3c-495f-933f-3b516167cbd2" containerName="barbican-db-sync" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.415178 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bd649e-5c3c-495f-933f-3b516167cbd2" containerName="barbican-db-sync" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.415199 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="685bde78-dea1-4864-a825-af176178bd11" containerName="cinder-db-sync" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.416302 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.438463 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.438668 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b94ff658f-bmntr"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.443699 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4blb4" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.444109 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.457983 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.463216 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.470635 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.471004 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qmdf5" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.471288 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.498892 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508369 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm55z\" (UniqueName: \"kubernetes.io/projected/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-kube-api-access-qm55z\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508474 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-logs\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508501 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-config-data\") pod 
\"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508522 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f839b4e9-f9f0-489d-b04b-14b03ab6895b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508561 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508574 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-config-data-custom\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508602 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508619 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcs4s\" (UniqueName: 
\"kubernetes.io/projected/f839b4e9-f9f0-489d-b04b-14b03ab6895b-kube-api-access-wcs4s\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508652 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508669 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.508695 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-combined-ca-bundle\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.524956 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b94ff658f-bmntr"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.551755 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-54fb8bddb6-w78xn"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.563213 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.563777 4870 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.571760 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.623780 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54fb8bddb6-w78xn"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.629957 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630111 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcs4s\" (UniqueName: \"kubernetes.io/projected/f839b4e9-f9f0-489d-b04b-14b03ab6895b-kube-api-access-wcs4s\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630222 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-scripts\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630272 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-combined-ca-bundle\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " 
pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630319 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630349 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrrs\" (UniqueName: \"kubernetes.io/projected/8a32795f-6328-4d51-a69a-60be965b17f0-kube-api-access-ncrrs\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630377 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-config-data-custom\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630441 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-combined-ca-bundle\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630535 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm55z\" (UniqueName: 
\"kubernetes.io/projected/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-kube-api-access-qm55z\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630636 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a32795f-6328-4d51-a69a-60be965b17f0-logs\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.630672 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-config-data\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.632730 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-logs\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.632771 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-config-data\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.634017 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f839b4e9-f9f0-489d-b04b-14b03ab6895b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.635425 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f839b4e9-f9f0-489d-b04b-14b03ab6895b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.639052 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-logs\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.639164 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.639189 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-config-data-custom\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.648678 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.649157 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-config-data\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.649697 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.650079 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-config-data-custom\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.650320 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.652241 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.652595 
4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.664988 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95c8f6689-d4pfh"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.666553 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.670770 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-combined-ca-bundle\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.681537 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcs4s\" (UniqueName: \"kubernetes.io/projected/f839b4e9-f9f0-489d-b04b-14b03ab6895b-kube-api-access-wcs4s\") pod \"cinder-scheduler-0\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.684644 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm55z\" (UniqueName: \"kubernetes.io/projected/a3bc44ff-bc04-4e44-bb13-ff62f43057f5-kube-api-access-qm55z\") pod \"barbican-worker-6b94ff658f-bmntr\" (UID: \"a3bc44ff-bc04-4e44-bb13-ff62f43057f5\") " pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.720919 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95c8f6689-d4pfh"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.741572 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-svc\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.741851 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2968w\" (UniqueName: \"kubernetes.io/projected/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-kube-api-access-2968w\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.741931 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-combined-ca-bundle\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.741954 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrrs\" (UniqueName: \"kubernetes.io/projected/8a32795f-6328-4d51-a69a-60be965b17f0-kube-api-access-ncrrs\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.741999 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-sb\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.742018 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-config-data-custom\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.742101 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-nb\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.745801 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a32795f-6328-4d51-a69a-60be965b17f0-logs\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.745916 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-config-data\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.746058 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-swift-storage-0\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc 
kubenswrapper[4870]: I0130 08:29:08.746099 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-config\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.747833 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a32795f-6328-4d51-a69a-60be965b17f0-logs\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.760843 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-config-data\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.762405 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-combined-ca-bundle\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.770889 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a32795f-6328-4d51-a69a-60be965b17f0-config-data-custom\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 
30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.771317 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.777225 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95c8f6689-d4pfh"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.801517 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrrs\" (UniqueName: \"kubernetes.io/projected/8a32795f-6328-4d51-a69a-60be965b17f0-kube-api-access-ncrrs\") pod \"barbican-keystone-listener-54fb8bddb6-w78xn\" (UID: \"8a32795f-6328-4d51-a69a-60be965b17f0\") " pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.812626 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b94ff658f-bmntr" Jan 30 08:29:08 crc kubenswrapper[4870]: E0130 08:29:08.852328 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-2968w ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" podUID="a59ee6c0-d68e-4e31-bf9e-1326d91c0633" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.867914 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2968w\" (UniqueName: \"kubernetes.io/projected/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-kube-api-access-2968w\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.868300 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-sb\") 
pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.869101 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-sb\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.869760 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-nb\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.869803 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-nb\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.869921 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-swift-storage-0\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.869960 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-config\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " 
pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.870442 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-swift-storage-0\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.870952 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-config\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.871117 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-svc\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.874352 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-svc\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.880275 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76ddf7d98c-drvjx"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.881912 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.910065 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-ddfdbf76d-mfqhx"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.911811 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.914151 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.919731 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2968w\" (UniqueName: \"kubernetes.io/projected/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-kube-api-access-2968w\") pod \"dnsmasq-dns-95c8f6689-d4pfh\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.920404 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.939956 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ddf7d98c-drvjx"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.959325 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-ddfdbf76d-mfqhx"] Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.991399 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgd6n\" (UniqueName: \"kubernetes.io/projected/c58c61f5-64cf-4fe3-9792-a9d7b0987188-kube-api-access-sgd6n\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.991507 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-config\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.991526 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-swift-storage-0\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.991546 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-svc\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.991638 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-sb\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:08 crc kubenswrapper[4870]: I0130 08:29:08.991669 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-nb\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 
08:29:09.024260 4870 generic.go:334] "Generic (PLEG): container finished" podID="edd09a42-14b6-4161-ba2a-82c4cf4f5983" containerID="85f1049088e388e69d6da33f4eab9143943bc4d4ba2179d9093657152d474310" exitCode=0 Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.024493 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tssp8" event={"ID":"edd09a42-14b6-4161-ba2a-82c4cf4f5983","Type":"ContainerDied","Data":"85f1049088e388e69d6da33f4eab9143943bc4d4ba2179d9093657152d474310"} Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.025786 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.028366 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.032097 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.041710 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.073057 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.080795 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.092826 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgd6n\" (UniqueName: \"kubernetes.io/projected/c58c61f5-64cf-4fe3-9792-a9d7b0987188-kube-api-access-sgd6n\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093218 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-config\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093241 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-swift-storage-0\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093257 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-svc\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093281 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093345 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data-custom\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093371 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a4731e-3232-4d27-acf8-9d34ee7570a7-logs\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093414 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-sb\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093439 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-combined-ca-bundle\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093478 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdrz\" 
(UniqueName: \"kubernetes.io/projected/57a4731e-3232-4d27-acf8-9d34ee7570a7-kube-api-access-2qdrz\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.093504 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-nb\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.094431 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-nb\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.094746 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-config\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.095002 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-sb\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.095482 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-swift-storage-0\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.095601 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-svc\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.132923 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgd6n\" (UniqueName: \"kubernetes.io/projected/c58c61f5-64cf-4fe3-9792-a9d7b0987188-kube-api-access-sgd6n\") pod \"dnsmasq-dns-76ddf7d98c-drvjx\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.201374 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-svc\") pod \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.201470 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-swift-storage-0\") pod \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.201561 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-nb\") pod \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\" (UID: 
\"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.201643 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-sb\") pod \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.201700 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2968w\" (UniqueName: \"kubernetes.io/projected/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-kube-api-access-2968w\") pod \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.201783 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-config\") pod \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\" (UID: \"a59ee6c0-d68e-4e31-bf9e-1326d91c0633\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202050 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdrz\" (UniqueName: \"kubernetes.io/projected/57a4731e-3232-4d27-acf8-9d34ee7570a7-kube-api-access-2qdrz\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202090 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202173 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-scripts\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202193 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhtln\" (UniqueName: \"kubernetes.io/projected/2c1333f8-2564-4b5c-84b9-0045d742c45f-kube-api-access-xhtln\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202228 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202244 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202283 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202327 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2c1333f8-2564-4b5c-84b9-0045d742c45f-logs\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202346 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1333f8-2564-4b5c-84b9-0045d742c45f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202363 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data-custom\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202383 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a4731e-3232-4d27-acf8-9d34ee7570a7-logs\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.202416 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-combined-ca-bundle\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.203850 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-config" (OuterVolumeSpecName: "config") pod "a59ee6c0-d68e-4e31-bf9e-1326d91c0633" (UID: 
"a59ee6c0-d68e-4e31-bf9e-1326d91c0633"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.204098 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a59ee6c0-d68e-4e31-bf9e-1326d91c0633" (UID: "a59ee6c0-d68e-4e31-bf9e-1326d91c0633"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.204307 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a59ee6c0-d68e-4e31-bf9e-1326d91c0633" (UID: "a59ee6c0-d68e-4e31-bf9e-1326d91c0633"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.204366 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a59ee6c0-d68e-4e31-bf9e-1326d91c0633" (UID: "a59ee6c0-d68e-4e31-bf9e-1326d91c0633"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.204752 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a59ee6c0-d68e-4e31-bf9e-1326d91c0633" (UID: "a59ee6c0-d68e-4e31-bf9e-1326d91c0633"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.206269 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a4731e-3232-4d27-acf8-9d34ee7570a7-logs\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.208063 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-kube-api-access-2968w" (OuterVolumeSpecName: "kube-api-access-2968w") pod "a59ee6c0-d68e-4e31-bf9e-1326d91c0633" (UID: "a59ee6c0-d68e-4e31-bf9e-1326d91c0633"). InnerVolumeSpecName "kube-api-access-2968w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.214236 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.221724 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data-custom\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.238539 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-combined-ca-bundle\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " 
pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.240407 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdrz\" (UniqueName: \"kubernetes.io/projected/57a4731e-3232-4d27-acf8-9d34ee7570a7-kube-api-access-2qdrz\") pod \"barbican-api-ddfdbf76d-mfqhx\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.244741 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.257751 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.259223 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.303654 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-scripts\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.303696 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhtln\" (UniqueName: \"kubernetes.io/projected/2c1333f8-2564-4b5c-84b9-0045d742c45f-kube-api-access-xhtln\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.303735 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.303754 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.303806 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c1333f8-2564-4b5c-84b9-0045d742c45f-logs\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.303826 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1333f8-2564-4b5c-84b9-0045d742c45f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.304605 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c1333f8-2564-4b5c-84b9-0045d742c45f-logs\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.308265 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.311567 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2c1333f8-2564-4b5c-84b9-0045d742c45f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.312316 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.312535 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2968w\" (UniqueName: \"kubernetes.io/projected/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-kube-api-access-2968w\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.312549 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.312559 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.312568 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.312577 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.312587 4870 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a59ee6c0-d68e-4e31-bf9e-1326d91c0633-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.317376 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.318517 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.326342 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-scripts\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.344129 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhtln\" (UniqueName: \"kubernetes.io/projected/2c1333f8-2564-4b5c-84b9-0045d742c45f-kube-api-access-xhtln\") pod \"cinder-api-0\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.397952 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.686353 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.697742 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5949fbc84f-vdxjp" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.820950 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-config-data\") pod \"4171155c-1d8c-48a0-9675-1c730f9130dc\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.821035 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-config\") pod \"505df376-c8bc-44ce-9c14-8cf94730c550\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.821365 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-scripts\") pod \"4171155c-1d8c-48a0-9675-1c730f9130dc\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.821395 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4171155c-1d8c-48a0-9675-1c730f9130dc-horizon-secret-key\") pod \"4171155c-1d8c-48a0-9675-1c730f9130dc\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.821426 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rwsc\" (UniqueName: \"kubernetes.io/projected/505df376-c8bc-44ce-9c14-8cf94730c550-kube-api-access-7rwsc\") pod \"505df376-c8bc-44ce-9c14-8cf94730c550\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " 
Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.821452 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-combined-ca-bundle\") pod \"505df376-c8bc-44ce-9c14-8cf94730c550\" (UID: \"505df376-c8bc-44ce-9c14-8cf94730c550\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.821518 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj77z\" (UniqueName: \"kubernetes.io/projected/4171155c-1d8c-48a0-9675-1c730f9130dc-kube-api-access-xj77z\") pod \"4171155c-1d8c-48a0-9675-1c730f9130dc\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.821613 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4171155c-1d8c-48a0-9675-1c730f9130dc-logs\") pod \"4171155c-1d8c-48a0-9675-1c730f9130dc\" (UID: \"4171155c-1d8c-48a0-9675-1c730f9130dc\") " Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.822407 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4171155c-1d8c-48a0-9675-1c730f9130dc-logs" (OuterVolumeSpecName: "logs") pod "4171155c-1d8c-48a0-9675-1c730f9130dc" (UID: "4171155c-1d8c-48a0-9675-1c730f9130dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.830944 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4171155c-1d8c-48a0-9675-1c730f9130dc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4171155c-1d8c-48a0-9675-1c730f9130dc" (UID: "4171155c-1d8c-48a0-9675-1c730f9130dc"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.837413 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4171155c-1d8c-48a0-9675-1c730f9130dc-kube-api-access-xj77z" (OuterVolumeSpecName: "kube-api-access-xj77z") pod "4171155c-1d8c-48a0-9675-1c730f9130dc" (UID: "4171155c-1d8c-48a0-9675-1c730f9130dc"). InnerVolumeSpecName "kube-api-access-xj77z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.834309 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505df376-c8bc-44ce-9c14-8cf94730c550-kube-api-access-7rwsc" (OuterVolumeSpecName: "kube-api-access-7rwsc") pod "505df376-c8bc-44ce-9c14-8cf94730c550" (UID: "505df376-c8bc-44ce-9c14-8cf94730c550"). InnerVolumeSpecName "kube-api-access-7rwsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.849215 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-config-data" (OuterVolumeSpecName: "config-data") pod "4171155c-1d8c-48a0-9675-1c730f9130dc" (UID: "4171155c-1d8c-48a0-9675-1c730f9130dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.856836 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-config" (OuterVolumeSpecName: "config") pod "505df376-c8bc-44ce-9c14-8cf94730c550" (UID: "505df376-c8bc-44ce-9c14-8cf94730c550"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.861829 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "505df376-c8bc-44ce-9c14-8cf94730c550" (UID: "505df376-c8bc-44ce-9c14-8cf94730c550"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.869812 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-scripts" (OuterVolumeSpecName: "scripts") pod "4171155c-1d8c-48a0-9675-1c730f9130dc" (UID: "4171155c-1d8c-48a0-9675-1c730f9130dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.923170 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.923197 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.923210 4870 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4171155c-1d8c-48a0-9675-1c730f9130dc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.923219 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rwsc\" (UniqueName: \"kubernetes.io/projected/505df376-c8bc-44ce-9c14-8cf94730c550-kube-api-access-7rwsc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc 
kubenswrapper[4870]: I0130 08:29:09.923228 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505df376-c8bc-44ce-9c14-8cf94730c550-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.923394 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj77z\" (UniqueName: \"kubernetes.io/projected/4171155c-1d8c-48a0-9675-1c730f9130dc-kube-api-access-xj77z\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.923402 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4171155c-1d8c-48a0-9675-1c730f9130dc-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:09 crc kubenswrapper[4870]: I0130 08:29:09.923410 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4171155c-1d8c-48a0-9675-1c730f9130dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.124124 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9mjj4" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.137234 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95c8f6689-d4pfh" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.137263 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5949fbc84f-vdxjp" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.278309 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b94ff658f-bmntr"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.278641 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ddf7d98c-drvjx"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.278656 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-656959885f-8m9f8"] Jan 30 08:29:10 crc kubenswrapper[4870]: E0130 08:29:10.278962 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505df376-c8bc-44ce-9c14-8cf94730c550" containerName="neutron-db-sync" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.278976 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="505df376-c8bc-44ce-9c14-8cf94730c550" containerName="neutron-db-sync" Jan 30 08:29:10 crc kubenswrapper[4870]: E0130 08:29:10.279003 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon-log" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.279009 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon-log" Jan 30 08:29:10 crc kubenswrapper[4870]: E0130 08:29:10.279023 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.279031 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.279180 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="505df376-c8bc-44ce-9c14-8cf94730c550" containerName="neutron-db-sync" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.279205 4870 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.279214 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" containerName="horizon-log" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.280532 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9mjj4" event={"ID":"505df376-c8bc-44ce-9c14-8cf94730c550","Type":"ContainerDied","Data":"171c5c07c9fb91c243425bc5e80be08d88ca8fe65555c57f93bf344f77f94faf"} Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.280566 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="171c5c07c9fb91c243425bc5e80be08d88ca8fe65555c57f93bf344f77f94faf" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.280576 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5949fbc84f-vdxjp" event={"ID":"4171155c-1d8c-48a0-9675-1c730f9130dc","Type":"ContainerDied","Data":"3cd56b7866047c6a3192e1f27cec1f489317b2b2d8b6d0b806475464e27ca26f"} Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.280597 4870 scope.go:117] "RemoveContainer" containerID="8001ca067558561639146186888f3fefa9a3f66b8cfe6da27c20754262532feb" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.280750 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.289824 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-656959885f-8m9f8"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.433756 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-svc\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.433793 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-config\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.433847 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-sb\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.433868 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-nb\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.433964 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-47htr\" (UniqueName: \"kubernetes.io/projected/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-kube-api-access-47htr\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.434010 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-swift-storage-0\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: E0130 08:29:10.532373 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.535189 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-sb\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.535223 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-nb\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.535390 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-47htr\" (UniqueName: \"kubernetes.io/projected/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-kube-api-access-47htr\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.535444 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-swift-storage-0\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.535475 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-svc\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.535490 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-config\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.536384 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-swift-storage-0\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.536572 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-sb\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.536992 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-nb\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.538961 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-config\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.539572 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-svc\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: E0130 08:29:10.545360 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:10 crc kubenswrapper[4870]: E0130 08:29:10.555641 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:10 crc kubenswrapper[4870]: E0130 08:29:10.556028 4870 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.570706 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47htr\" (UniqueName: \"kubernetes.io/projected/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-kube-api-access-47htr\") pod \"dnsmasq-dns-656959885f-8m9f8\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.591288 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f966fd88d-sdpcn"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.593091 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.596442 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.596701 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.596719 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.596760 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nnfmm" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.606665 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95c8f6689-d4pfh"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.621388 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95c8f6689-d4pfh"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.630007 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f966fd88d-sdpcn"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.642966 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5949fbc84f-vdxjp"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.652276 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5949fbc84f-vdxjp"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.698462 4870 scope.go:117] "RemoveContainer" containerID="57f324a7af1ed982422d86d531cc1073fc7e06667530fd2daf47081062016e35" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.743691 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-ovndb-tls-certs\") pod 
\"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.743742 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8xs7\" (UniqueName: \"kubernetes.io/projected/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-kube-api-access-z8xs7\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.743787 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-combined-ca-bundle\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.744068 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-httpd-config\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.744153 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-config\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.816974 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.845873 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-httpd-config\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.845942 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-config\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.846009 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-ovndb-tls-certs\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.846053 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8xs7\" (UniqueName: \"kubernetes.io/projected/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-kube-api-access-z8xs7\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.846103 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-combined-ca-bundle\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: 
I0130 08:29:10.859508 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.862101 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-combined-ca-bundle\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.862574 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-ovndb-tls-certs\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.865219 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-config\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.869147 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-httpd-config\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.887774 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.907652 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.917215 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-z8xs7\" (UniqueName: \"kubernetes.io/projected/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-kube-api-access-z8xs7\") pod \"neutron-f966fd88d-sdpcn\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") " pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:10 crc kubenswrapper[4870]: I0130 08:29:10.934954 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f966fd88d-sdpcn" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.171963 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b94ff658f-bmntr" event={"ID":"a3bc44ff-bc04-4e44-bb13-ff62f43057f5","Type":"ContainerStarted","Data":"d8b67a8dbdaa2aa5ce60c6293caf9259100b637b6095951bf5eb97842cd01f10"} Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.174195 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerStarted","Data":"1368b589787d7b14188bdd5cbf6d5c41177fec471c90618303e481623b136b42"} Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.175508 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c1333f8-2564-4b5c-84b9-0045d742c45f","Type":"ContainerStarted","Data":"5eca3df12794c5b43fdb77c898c9bd28c39f3103bd50eb3571fc088c025d0cf9"} Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.179149 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f839b4e9-f9f0-489d-b04b-14b03ab6895b","Type":"ContainerStarted","Data":"104aebc0ebfaddceac864d04e025d33e5cf5c1cacba0ac14afe67f72e174ead6"} Jan 30 08:29:11 crc kubenswrapper[4870]: E0130 08:29:11.189631 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0: 
Get \\\"http://38.102.83.23:5001/v2/podified-master-centos10/openstack-ceilometer-central/blobs/sha256:bd27ff135622ee80d6d6693f9c0bf8e444ef41832cda5564c2025ca13b50eaf0\\\": context canceled\"" pod="openstack/ceilometer-0" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.336344 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tssp8" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.352973 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-ddfdbf76d-mfqhx"] Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.407950 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ddf7d98c-drvjx"] Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.424075 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54fb8bddb6-w78xn"] Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.479560 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-db-sync-config-data\") pod \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.479730 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-combined-ca-bundle\") pod \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.479785 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-config-data\") pod \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\" (UID: 
\"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.479821 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8n8\" (UniqueName: \"kubernetes.io/projected/edd09a42-14b6-4161-ba2a-82c4cf4f5983-kube-api-access-7n8n8\") pod \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\" (UID: \"edd09a42-14b6-4161-ba2a-82c4cf4f5983\") " Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.551735 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd09a42-14b6-4161-ba2a-82c4cf4f5983-kube-api-access-7n8n8" (OuterVolumeSpecName: "kube-api-access-7n8n8") pod "edd09a42-14b6-4161-ba2a-82c4cf4f5983" (UID: "edd09a42-14b6-4161-ba2a-82c4cf4f5983"). InnerVolumeSpecName "kube-api-access-7n8n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.584332 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8n8\" (UniqueName: \"kubernetes.io/projected/edd09a42-14b6-4161-ba2a-82c4cf4f5983-kube-api-access-7n8n8\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.595130 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edd09a42-14b6-4161-ba2a-82c4cf4f5983" (UID: "edd09a42-14b6-4161-ba2a-82c4cf4f5983"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.604056 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "edd09a42-14b6-4161-ba2a-82c4cf4f5983" (UID: "edd09a42-14b6-4161-ba2a-82c4cf4f5983"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.615513 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-656959885f-8m9f8"] Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.650935 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-config-data" (OuterVolumeSpecName: "config-data") pod "edd09a42-14b6-4161-ba2a-82c4cf4f5983" (UID: "edd09a42-14b6-4161-ba2a-82c4cf4f5983"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.688109 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.688146 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.688175 4870 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/edd09a42-14b6-4161-ba2a-82c4cf4f5983-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:11 crc kubenswrapper[4870]: I0130 08:29:11.975740 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f966fd88d-sdpcn"] Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.128347 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4171155c-1d8c-48a0-9675-1c730f9130dc" path="/var/lib/kubelet/pods/4171155c-1d8c-48a0-9675-1c730f9130dc/volumes" Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.129086 4870 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a59ee6c0-d68e-4e31-bf9e-1326d91c0633" path="/var/lib/kubelet/pods/a59ee6c0-d68e-4e31-bf9e-1326d91c0633/volumes"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.217722 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" event={"ID":"8a32795f-6328-4d51-a69a-60be965b17f0","Type":"ContainerStarted","Data":"8fc75aa3cc115bc611e81fa24f5880ce7c89467cc1852eadde5f662ffbd94434"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.221059 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-ddfdbf76d-mfqhx" event={"ID":"57a4731e-3232-4d27-acf8-9d34ee7570a7","Type":"ContainerStarted","Data":"4bfee3b8db156e8f632e1b810fed5fff1f01f89361fe371e88524396b0b3e740"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.221093 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-ddfdbf76d-mfqhx" event={"ID":"57a4731e-3232-4d27-acf8-9d34ee7570a7","Type":"ContainerStarted","Data":"4fefb5421067779e4ffb7448501feacbbd8e1262345c29ebcf35ade1e4bf9f85"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.229127 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerStarted","Data":"c24e43cc0a18413bd641a93cefd91035b18efb88407c422b061290231df7fca8"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.229260 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="ceilometer-notification-agent" containerID="cri-o://b45a58a1e3e4865b397313616e6494da5d8e1887dd9401a657303e526b984274" gracePeriod=30
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.229338 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.230225 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="proxy-httpd" containerID="cri-o://c24e43cc0a18413bd641a93cefd91035b18efb88407c422b061290231df7fca8" gracePeriod=30
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.230661 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="sg-core" containerID="cri-o://eb3296110237841074b9a042e7cdf380a70fa9819f62258f821b5124b40eb835" gracePeriod=30
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.234114 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f966fd88d-sdpcn" event={"ID":"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88","Type":"ContainerStarted","Data":"cfedb81ba7d9e195fe41ff8a768c117183039fb6da240c26ec467012add1460c"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.237993 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656959885f-8m9f8" event={"ID":"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa","Type":"ContainerStarted","Data":"27ed939972c481c8d851e169e12137d6f31939cc5f5f7a13dd1d93784b1b442d"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.247549 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tssp8" event={"ID":"edd09a42-14b6-4161-ba2a-82c4cf4f5983","Type":"ContainerDied","Data":"8bfad17f6d235c11635a3d5c597e4e8cad4341b4b72906e827a01ca540cffaac"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.247586 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bfad17f6d235c11635a3d5c597e4e8cad4341b4b72906e827a01ca540cffaac"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.247646 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tssp8"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.255579 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" event={"ID":"c58c61f5-64cf-4fe3-9792-a9d7b0987188","Type":"ContainerStarted","Data":"0d8b1a72f03d9ad69f8ac9e36fd394f650e5d7b9d43ccffdaff2ad1160e3aeef"}
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.808255 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-656959885f-8m9f8"]
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.851970 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d675956bc-zzkss"]
Jan 30 08:29:12 crc kubenswrapper[4870]: E0130 08:29:12.856101 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd09a42-14b6-4161-ba2a-82c4cf4f5983" containerName="glance-db-sync"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.856131 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd09a42-14b6-4161-ba2a-82c4cf4f5983" containerName="glance-db-sync"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.867958 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd09a42-14b6-4161-ba2a-82c4cf4f5983" containerName="glance-db-sync"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.870877 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:12 crc kubenswrapper[4870]: I0130 08:29:12.889431 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d675956bc-zzkss"]
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.051376 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-swift-storage-0\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.052586 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-sb\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.052712 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-nb\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.052781 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xpwr\" (UniqueName: \"kubernetes.io/projected/3688605b-306e-4093-93d5-b96cae2a80de-kube-api-access-6xpwr\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.052904 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-config\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.052987 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-svc\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.154381 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-config\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.154428 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-svc\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.154515 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-swift-storage-0\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.154552 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-sb\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.154591 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-nb\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.154605 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xpwr\" (UniqueName: \"kubernetes.io/projected/3688605b-306e-4093-93d5-b96cae2a80de-kube-api-access-6xpwr\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.156373 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-config\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.156392 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-svc\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.157189 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-sb\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.157916 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-nb\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.158614 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-swift-storage-0\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.176750 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xpwr\" (UniqueName: \"kubernetes.io/projected/3688605b-306e-4093-93d5-b96cae2a80de-kube-api-access-6xpwr\") pod \"dnsmasq-dns-5d675956bc-zzkss\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.215866 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d675956bc-zzkss"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.295299 4870 generic.go:334] "Generic (PLEG): container finished" podID="9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" containerID="34ec090b045bffd5d1230bb03e6f1ad024b19b6f7c9f069106bed74b50df542e" exitCode=0
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.295373 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656959885f-8m9f8" event={"ID":"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa","Type":"ContainerDied","Data":"34ec090b045bffd5d1230bb03e6f1ad024b19b6f7c9f069106bed74b50df542e"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.311114 4870 generic.go:334] "Generic (PLEG): container finished" podID="c58c61f5-64cf-4fe3-9792-a9d7b0987188" containerID="716b6800404b937b1b37ac9d66dd42946bf82bd2065094be653871d4c6645e5a" exitCode=0
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.311192 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" event={"ID":"c58c61f5-64cf-4fe3-9792-a9d7b0987188","Type":"ContainerDied","Data":"716b6800404b937b1b37ac9d66dd42946bf82bd2065094be653871d4c6645e5a"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.339448 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c1333f8-2564-4b5c-84b9-0045d742c45f","Type":"ContainerStarted","Data":"135b22419515f0c37fa07d5cae62ea43515dc6460498408338173f7df4e2361b"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.384315 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f839b4e9-f9f0-489d-b04b-14b03ab6895b","Type":"ContainerStarted","Data":"43e89a562f59d4e0caa713b4d3d7d10459a54c334d8bf93738ef4bfb17bc36b1"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.442175 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-ddfdbf76d-mfqhx" event={"ID":"57a4731e-3232-4d27-acf8-9d34ee7570a7","Type":"ContainerStarted","Data":"9a536688d050dc6432091788f5363412218a7bb425a3e180118973e359516afe"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.442260 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-ddfdbf76d-mfqhx"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.442281 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-ddfdbf76d-mfqhx"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.464133 4870 generic.go:334] "Generic (PLEG): container finished" podID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerID="c24e43cc0a18413bd641a93cefd91035b18efb88407c422b061290231df7fca8" exitCode=0
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.464189 4870 generic.go:334] "Generic (PLEG): container finished" podID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerID="eb3296110237841074b9a042e7cdf380a70fa9819f62258f821b5124b40eb835" exitCode=2
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.464164 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerDied","Data":"c24e43cc0a18413bd641a93cefd91035b18efb88407c422b061290231df7fca8"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.464364 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerDied","Data":"eb3296110237841074b9a042e7cdf380a70fa9819f62258f821b5124b40eb835"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.486254 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f966fd88d-sdpcn" event={"ID":"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88","Type":"ContainerStarted","Data":"8e106b0c6b2ed513250f13c043895b69dbe1cd77d36b5ecd4e47e2f2226b112e"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.486298 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f966fd88d-sdpcn" event={"ID":"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88","Type":"ContainerStarted","Data":"b1d3aae9bf64c7d5adb6c7a0c0cef4cbd05ceee79bf2bdc26b1676e0ef8ac7ff"}
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.487412 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f966fd88d-sdpcn"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.491637 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-ddfdbf76d-mfqhx" podStartSLOduration=5.491618924 podStartE2EDuration="5.491618924s" podCreationTimestamp="2026-01-30 08:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:13.475295252 +0000 UTC m=+1192.170842361" watchObservedRunningTime="2026-01-30 08:29:13.491618924 +0000 UTC m=+1192.187166033"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.524906 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f966fd88d-sdpcn" podStartSLOduration=3.52488899 podStartE2EDuration="3.52488899s" podCreationTimestamp="2026-01-30 08:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:13.518287393 +0000 UTC m=+1192.213834502" watchObservedRunningTime="2026-01-30 08:29:13.52488899 +0000 UTC m=+1192.220436089"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.849590 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.855446 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.859434 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-58ht6"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.860450 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.860975 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.889018 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.978991 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.980791 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.982569 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.982628 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.982658 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-logs\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.982769 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.982831 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfc5p\" (UniqueName: \"kubernetes.io/projected/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-kube-api-access-qfc5p\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.982864 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.982907 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.983996 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 30 08:29:13 crc kubenswrapper[4870]: I0130 08:29:13.993966 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085511 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085551 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085575 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085601 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085627 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085655 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-logs\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085678 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085698 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085720 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx4qh\" (UniqueName: \"kubernetes.io/projected/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-kube-api-access-tx4qh\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085765 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085815 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfc5p\" (UniqueName: \"kubernetes.io/projected/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-kube-api-access-qfc5p\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085847 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085877 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.085914 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.088043 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.099139 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.102663 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d675956bc-zzkss"]
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.102906 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.103132 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.135443 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfc5p\" (UniqueName: \"kubernetes.io/projected/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-kube-api-access-qfc5p\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.149777 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.149915 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-logs\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.157042 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.187265 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.187313 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.187335 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx4qh\" (UniqueName: \"kubernetes.io/projected/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-kube-api-access-tx4qh\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.187537 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.187574 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.187606 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.187637 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.190874 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.191440 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-logs\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.192806 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.193073 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.193435 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.195614 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.196908 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.219003 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx4qh\" (UniqueName: \"kubernetes.io/projected/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-kube-api-access-tx4qh\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.241167 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.309984 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.498034 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c1333f8-2564-4b5c-84b9-0045d742c45f","Type":"ContainerStarted","Data":"15d4837d5c345debdddee40f4775635705eb028fe7633d8e6d5c855f92746c7a"}
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.498143 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api-log" containerID="cri-o://135b22419515f0c37fa07d5cae62ea43515dc6460498408338173f7df4e2361b" gracePeriod=30
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.498217 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api" containerID="cri-o://15d4837d5c345debdddee40f4775635705eb028fe7633d8e6d5c855f92746c7a" gracePeriod=30
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.524514 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.524497007 podStartE2EDuration="6.524497007s" podCreationTimestamp="2026-01-30 08:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:14.522801983 +0000 UTC m=+1193.218349082" watchObservedRunningTime="2026-01-30 08:29:14.524497007 +0000 UTC m=+1193.220044116"
Jan 30 08:29:14 crc kubenswrapper[4870]: W0130 08:29:14.656192 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3688605b_306e_4093_93d5_b96cae2a80de.slice/crio-38880a59bd8e8ad87943840cbaa98251faf8d234264c0ca4ae49cc1e495e8ef5 WatchSource:0}: Error finding container 38880a59bd8e8ad87943840cbaa98251faf8d234264c0ca4ae49cc1e495e8ef5: Status 404 returned error can't find the container with id 38880a59bd8e8ad87943840cbaa98251faf8d234264c0ca4ae49cc1e495e8ef5
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.714198 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74569d8966-5sjxs" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.714428 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74569d8966-5sjxs"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.774156 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx"
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.915493 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-config\") pod \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") "
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.915833 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-nb\") pod \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") "
Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.915950 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-swift-storage-0\") pod \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\")
" Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.916117 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-svc\") pod \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.916152 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-sb\") pod \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.916225 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgd6n\" (UniqueName: \"kubernetes.io/projected/c58c61f5-64cf-4fe3-9792-a9d7b0987188-kube-api-access-sgd6n\") pod \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\" (UID: \"c58c61f5-64cf-4fe3-9792-a9d7b0987188\") " Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.925060 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58c61f5-64cf-4fe3-9792-a9d7b0987188-kube-api-access-sgd6n" (OuterVolumeSpecName: "kube-api-access-sgd6n") pod "c58c61f5-64cf-4fe3-9792-a9d7b0987188" (UID: "c58c61f5-64cf-4fe3-9792-a9d7b0987188"). InnerVolumeSpecName "kube-api-access-sgd6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.947135 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c58c61f5-64cf-4fe3-9792-a9d7b0987188" (UID: "c58c61f5-64cf-4fe3-9792-a9d7b0987188"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.955433 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c58c61f5-64cf-4fe3-9792-a9d7b0987188" (UID: "c58c61f5-64cf-4fe3-9792-a9d7b0987188"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.958161 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c58c61f5-64cf-4fe3-9792-a9d7b0987188" (UID: "c58c61f5-64cf-4fe3-9792-a9d7b0987188"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.964657 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-config" (OuterVolumeSpecName: "config") pod "c58c61f5-64cf-4fe3-9792-a9d7b0987188" (UID: "c58c61f5-64cf-4fe3-9792-a9d7b0987188"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:14 crc kubenswrapper[4870]: I0130 08:29:14.967691 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c58c61f5-64cf-4fe3-9792-a9d7b0987188" (UID: "c58c61f5-64cf-4fe3-9792-a9d7b0987188"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.021061 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgd6n\" (UniqueName: \"kubernetes.io/projected/c58c61f5-64cf-4fe3-9792-a9d7b0987188-kube-api-access-sgd6n\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.021107 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.021121 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.021131 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.021140 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.021149 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c58c61f5-64cf-4fe3-9792-a9d7b0987188-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.538027 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" event={"ID":"c58c61f5-64cf-4fe3-9792-a9d7b0987188","Type":"ContainerDied","Data":"0d8b1a72f03d9ad69f8ac9e36fd394f650e5d7b9d43ccffdaff2ad1160e3aeef"} Jan 30 08:29:15 crc 
kubenswrapper[4870]: I0130 08:29:15.538085 4870 scope.go:117] "RemoveContainer" containerID="716b6800404b937b1b37ac9d66dd42946bf82bd2065094be653871d4c6645e5a" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.538223 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ddf7d98c-drvjx" Jan 30 08:29:15 crc kubenswrapper[4870]: E0130 08:29:15.544545 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:15 crc kubenswrapper[4870]: E0130 08:29:15.548717 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:15 crc kubenswrapper[4870]: E0130 08:29:15.559941 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 08:29:15 crc kubenswrapper[4870]: E0130 08:29:15.560018 4870 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.567051 4870 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" event={"ID":"3688605b-306e-4093-93d5-b96cae2a80de","Type":"ContainerStarted","Data":"38880a59bd8e8ad87943840cbaa98251faf8d234264c0ca4ae49cc1e495e8ef5"} Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.584107 4870 generic.go:334] "Generic (PLEG): container finished" podID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerID="1368b589787d7b14188bdd5cbf6d5c41177fec471c90618303e481623b136b42" exitCode=1 Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.584162 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerDied","Data":"1368b589787d7b14188bdd5cbf6d5c41177fec471c90618303e481623b136b42"} Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.584693 4870 scope.go:117] "RemoveContainer" containerID="1368b589787d7b14188bdd5cbf6d5c41177fec471c90618303e481623b136b42" Jan 30 08:29:15 crc kubenswrapper[4870]: E0130 08:29:15.585513 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8628af25-d5e4-46a0-adec-4c25ca39676b)\"" pod="openstack/watcher-decision-engine-0" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.632881 4870 generic.go:334] "Generic (PLEG): container finished" podID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerID="135b22419515f0c37fa07d5cae62ea43515dc6460498408338173f7df4e2361b" exitCode=143 Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.633113 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c1333f8-2564-4b5c-84b9-0045d742c45f","Type":"ContainerDied","Data":"135b22419515f0c37fa07d5cae62ea43515dc6460498408338173f7df4e2361b"} Jan 30 08:29:15 crc 
kubenswrapper[4870]: I0130 08:29:15.641858 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f839b4e9-f9f0-489d-b04b-14b03ab6895b","Type":"ContainerStarted","Data":"8eb8ea85632818a452e1edbc831618a751f9abde5e34ca94d13504592653fdb8"} Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.656361 4870 generic.go:334] "Generic (PLEG): container finished" podID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerID="b45a58a1e3e4865b397313616e6494da5d8e1887dd9401a657303e526b984274" exitCode=0 Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.657652 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerDied","Data":"b45a58a1e3e4865b397313616e6494da5d8e1887dd9401a657303e526b984274"} Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.682836 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.776160973 podStartE2EDuration="7.68282084s" podCreationTimestamp="2026-01-30 08:29:08 +0000 UTC" firstStartedPulling="2026-01-30 08:29:10.889835039 +0000 UTC m=+1189.585382148" lastFinishedPulling="2026-01-30 08:29:11.796494896 +0000 UTC m=+1190.492042015" observedRunningTime="2026-01-30 08:29:15.672204997 +0000 UTC m=+1194.367752106" watchObservedRunningTime="2026-01-30 08:29:15.68282084 +0000 UTC m=+1194.378367949" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.762415 4870 scope.go:117] "RemoveContainer" containerID="6d57bab3d6c90d46f3ba26f44c864a6ec85718286b3ff7835459214c488726bc" Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.821658 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ddf7d98c-drvjx"] Jan 30 08:29:15 crc kubenswrapper[4870]: I0130 08:29:15.830485 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76ddf7d98c-drvjx"] Jan 30 08:29:15 crc kubenswrapper[4870]: 
E0130 08:29:15.922135 4870 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 30 08:29:15 crc kubenswrapper[4870]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 08:29:15 crc kubenswrapper[4870]: > podSandboxID="27ed939972c481c8d851e169e12137d6f31939cc5f5f7a13dd1d93784b1b442d" Jan 30 08:29:15 crc kubenswrapper[4870]: E0130 08:29:15.922484 4870 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 08:29:15 crc kubenswrapper[4870]: container &Container{Name:dnsmasq-dns,Image:38.102.83.23:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n598h74h59ch8ch64h599hf9hf7h668hdch8ch597h65bh59ch8dh6hc7h86h57fh649h75h586h655h57fh58bh54dh564h5b8h68fh54bh55h56dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-47htr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-656959885f-8m9f8_openstack(9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 08:29:15 crc kubenswrapper[4870]: > logger="UnhandledError" Jan 30 08:29:15 crc kubenswrapper[4870]: E0130 08:29:15.924687 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-656959885f-8m9f8" podUID="9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.096176 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58c61f5-64cf-4fe3-9792-a9d7b0987188" path="/var/lib/kubelet/pods/c58c61f5-64cf-4fe3-9792-a9d7b0987188/volumes" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.229733 4870 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.355650 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdb9v\" (UniqueName: \"kubernetes.io/projected/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-kube-api-access-gdb9v\") pod \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.355714 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-sg-core-conf-yaml\") pod \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.355753 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-log-httpd\") pod \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.355782 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-run-httpd\") pod \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.355922 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-combined-ca-bundle\") pod \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.356009 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-config-data\") pod \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.356051 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-scripts\") pod \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\" (UID: \"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2\") " Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.360093 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" (UID: "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.360316 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-scripts" (OuterVolumeSpecName: "scripts") pod "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" (UID: "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.360566 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" (UID: "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.373215 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-kube-api-access-gdb9v" (OuterVolumeSpecName: "kube-api-access-gdb9v") pod "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" (UID: "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2"). InnerVolumeSpecName "kube-api-access-gdb9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.434482 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" (UID: "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.435454 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.460541 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.460571 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdb9v\" (UniqueName: \"kubernetes.io/projected/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-kube-api-access-gdb9v\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.460584 4870 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.460596 
4870 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.460606 4870 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.463161 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" (UID: "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.523469 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.524340 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-config-data" (OuterVolumeSpecName: "config-data") pod "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" (UID: "3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:16 crc kubenswrapper[4870]: W0130 08:29:16.530292 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f83ba22_1075_4159_b19d_f0b9ceec4ac3.slice/crio-8eeee39b66f7aa76b98a5db5ced5f5152684b0c7a60879e1f3f2cf389449b18e WatchSource:0}: Error finding container 8eeee39b66f7aa76b98a5db5ced5f5152684b0c7a60879e1f3f2cf389449b18e: Status 404 returned error can't find the container with id 8eeee39b66f7aa76b98a5db5ced5f5152684b0c7a60879e1f3f2cf389449b18e Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.561703 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.561732 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.691528 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b94ff658f-bmntr" event={"ID":"a3bc44ff-bc04-4e44-bb13-ff62f43057f5","Type":"ContainerStarted","Data":"3298eed18edf06e451a427ebc9742fc8332e80a14168a95a23ab3fad8b1cb025"} Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.698433 4870 generic.go:334] "Generic (PLEG): container finished" podID="3688605b-306e-4093-93d5-b96cae2a80de" containerID="e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8" exitCode=0 Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.698692 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" 
event={"ID":"3688605b-306e-4093-93d5-b96cae2a80de","Type":"ContainerDied","Data":"e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8"} Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.734157 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" event={"ID":"8a32795f-6328-4d51-a69a-60be965b17f0","Type":"ContainerStarted","Data":"bc53f0af1a8dc4650c6534fd0f5b47e13a20ad4020c59618ab83f14b87f5c1fb"} Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.821187 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2","Type":"ContainerDied","Data":"0a2dbcce6be2e5137bdbc1dec4f8f525f5301e8818ebf37e38952868cb263db6"} Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.821417 4870 scope.go:117] "RemoveContainer" containerID="c24e43cc0a18413bd641a93cefd91035b18efb88407c422b061290231df7fca8" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.821613 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.826424 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f83ba22-1075-4159-b19d-f0b9ceec4ac3","Type":"ContainerStarted","Data":"8eeee39b66f7aa76b98a5db5ced5f5152684b0c7a60879e1f3f2cf389449b18e"} Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.827277 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0","Type":"ContainerStarted","Data":"fab1f41c1ff636465358dde7b3c49c985eff8cd0920b9b81dab7f18f354ae31d"} Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.886301 4870 scope.go:117] "RemoveContainer" containerID="eb3296110237841074b9a042e7cdf380a70fa9819f62258f821b5124b40eb835" Jan 30 08:29:16 crc kubenswrapper[4870]: I0130 08:29:16.961846 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.045978 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075056 4870 scope.go:117] "RemoveContainer" containerID="b45a58a1e3e4865b397313616e6494da5d8e1887dd9401a657303e526b984274" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075195 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:17 crc kubenswrapper[4870]: E0130 08:29:17.075591 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="proxy-httpd" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075608 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="proxy-httpd" Jan 30 08:29:17 crc kubenswrapper[4870]: E0130 08:29:17.075624 4870 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="sg-core" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075631 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="sg-core" Jan 30 08:29:17 crc kubenswrapper[4870]: E0130 08:29:17.075645 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="ceilometer-notification-agent" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075651 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="ceilometer-notification-agent" Jan 30 08:29:17 crc kubenswrapper[4870]: E0130 08:29:17.075661 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58c61f5-64cf-4fe3-9792-a9d7b0987188" containerName="init" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075667 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58c61f5-64cf-4fe3-9792-a9d7b0987188" containerName="init" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075856 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="ceilometer-notification-agent" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075870 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="proxy-httpd" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075925 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" containerName="sg-core" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.075938 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58c61f5-64cf-4fe3-9792-a9d7b0987188" containerName="init" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.077700 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.087015 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.087211 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.094914 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.193792 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-scripts\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.198548 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jkw\" (UniqueName: \"kubernetes.io/projected/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-kube-api-access-q6jkw\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.198637 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-log-httpd\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.198912 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " 
pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.199060 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-config-data\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.199152 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.199180 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-run-httpd\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.302234 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jkw\" (UniqueName: \"kubernetes.io/projected/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-kube-api-access-q6jkw\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.302507 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-log-httpd\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.302553 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.302589 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-config-data\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.302614 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.302633 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-run-httpd\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.302668 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-scripts\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.303260 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-log-httpd\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: 
I0130 08:29:17.304008 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d69bf9957-gj6dt"] Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.312136 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-run-httpd\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.327719 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-config-data\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.329602 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.330866 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d69bf9957-gj6dt"] Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.330979 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.335449 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.337309 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.337487 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.343513 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-scripts\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.347485 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jkw\" (UniqueName: \"kubernetes.io/projected/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-kube-api-access-q6jkw\") pod \"ceilometer-0\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") " pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.405656 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59zhw\" (UniqueName: \"kubernetes.io/projected/a50dec5c-d013-42b7-8a60-c405d5c93362-kube-api-access-59zhw\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.405720 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-internal-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.405744 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-config\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.405807 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-combined-ca-bundle\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.405868 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-httpd-config\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.406008 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-ovndb-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.406081 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-public-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.509361 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.510108 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-internal-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.510158 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-config\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.510215 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-combined-ca-bundle\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.510272 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-httpd-config\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 
crc kubenswrapper[4870]: I0130 08:29:17.510299 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-ovndb-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.510361 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-public-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.510407 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59zhw\" (UniqueName: \"kubernetes.io/projected/a50dec5c-d013-42b7-8a60-c405d5c93362-kube-api-access-59zhw\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.522645 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-public-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.522868 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-config\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.523233 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-internal-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.523579 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-ovndb-tls-certs\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.523671 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-combined-ca-bundle\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.529675 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59zhw\" (UniqueName: \"kubernetes.io/projected/a50dec5c-d013-42b7-8a60-c405d5c93362-kube-api-access-59zhw\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.533377 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a50dec5c-d013-42b7-8a60-c405d5c93362-httpd-config\") pod \"neutron-6d69bf9957-gj6dt\" (UID: \"a50dec5c-d013-42b7-8a60-c405d5c93362\") " pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.538151 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.611494 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-sb\") pod \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.611826 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47htr\" (UniqueName: \"kubernetes.io/projected/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-kube-api-access-47htr\") pod \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.611949 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-swift-storage-0\") pod \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.611976 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-nb\") pod \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.612042 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-svc\") pod \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.612105 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-config\") pod \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\" (UID: \"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa\") " Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.634090 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-kube-api-access-47htr" (OuterVolumeSpecName: "kube-api-access-47htr") pod "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" (UID: "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa"). InnerVolumeSpecName "kube-api-access-47htr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.684913 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" (UID: "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.715266 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.715302 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47htr\" (UniqueName: \"kubernetes.io/projected/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-kube-api-access-47htr\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.760390 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" (UID: "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.790373 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.822221 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.847240 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b94ff658f-bmntr" event={"ID":"a3bc44ff-bc04-4e44-bb13-ff62f43057f5","Type":"ContainerStarted","Data":"901d7ab5cd075a77d8bbaef11b7f69e428520ec1d79e2ffb17e7fc2f4b0ce91b"} Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.849682 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" event={"ID":"3688605b-306e-4093-93d5-b96cae2a80de","Type":"ContainerStarted","Data":"0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e"} Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.849938 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.856304 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" event={"ID":"8a32795f-6328-4d51-a69a-60be965b17f0","Type":"ContainerStarted","Data":"73a46e911f963119f7efe1f461bfa36e17f07e688eb4337a0912046a69817e39"} Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.860103 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-config" (OuterVolumeSpecName: "config") pod "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" (UID: "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.865768 4870 generic.go:334] "Generic (PLEG): container finished" podID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" exitCode=137 Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.865825 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d501bb9c-d88d-4362-a48e-4d0347ecc90e","Type":"ContainerDied","Data":"ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc"} Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.883850 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656959885f-8m9f8" event={"ID":"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa","Type":"ContainerDied","Data":"27ed939972c481c8d851e169e12137d6f31939cc5f5f7a13dd1d93784b1b442d"} Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.883919 4870 scope.go:117] "RemoveContainer" containerID="34ec090b045bffd5d1230bb03e6f1ad024b19b6f7c9f069106bed74b50df542e" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.884009 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-656959885f-8m9f8" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.887291 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b94ff658f-bmntr" podStartSLOduration=4.6205938159999995 podStartE2EDuration="9.887271241s" podCreationTimestamp="2026-01-30 08:29:08 +0000 UTC" firstStartedPulling="2026-01-30 08:29:10.341283744 +0000 UTC m=+1189.036830853" lastFinishedPulling="2026-01-30 08:29:15.607961169 +0000 UTC m=+1194.303508278" observedRunningTime="2026-01-30 08:29:17.865399724 +0000 UTC m=+1196.560946833" watchObservedRunningTime="2026-01-30 08:29:17.887271241 +0000 UTC m=+1196.582818350" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.888459 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-54fb8bddb6-w78xn" podStartSLOduration=5.899615374 podStartE2EDuration="9.888454369s" podCreationTimestamp="2026-01-30 08:29:08 +0000 UTC" firstStartedPulling="2026-01-30 08:29:11.746807275 +0000 UTC m=+1190.442354384" lastFinishedPulling="2026-01-30 08:29:15.73564627 +0000 UTC m=+1194.431193379" observedRunningTime="2026-01-30 08:29:17.886346983 +0000 UTC m=+1196.581894092" watchObservedRunningTime="2026-01-30 08:29:17.888454369 +0000 UTC m=+1196.584001478" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.897236 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0","Type":"ContainerStarted","Data":"2d530828db1e13fb59b8a6f3bf5f6b711b1aef5265c44e2042172678ae798ad8"} Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.907101 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" (UID: 
"9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.914151 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55b585f57f-9h2lg" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.924662 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.924699 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.932827 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" (UID: "9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:17 crc kubenswrapper[4870]: I0130 08:29:17.963575 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" podStartSLOduration=5.963557168 podStartE2EDuration="5.963557168s" podCreationTimestamp="2026-01-30 08:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:17.913391402 +0000 UTC m=+1196.608938511" watchObservedRunningTime="2026-01-30 08:29:17.963557168 +0000 UTC m=+1196.659104277" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.026164 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.096198 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2" path="/var/lib/kubelet/pods/3bf6a0a0-7d14-4cbd-96e4-c81ac5366fb2/volumes" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.172685 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:18 crc kubenswrapper[4870]: W0130 08:29:18.182252 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbebd196d_f8e4_466e_aa1f_99a65e3c7c6f.slice/crio-ea0f16d0885c4f830bf99a5da540d693451546a651ab72659db0f8c0dde59721 WatchSource:0}: Error finding container ea0f16d0885c4f830bf99a5da540d693451546a651ab72659db0f8c0dde59721: Status 404 returned error can't find the container with id ea0f16d0885c4f830bf99a5da540d693451546a651ab72659db0f8c0dde59721 Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.217993 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" 
Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.218369 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.223145 4870 scope.go:117] "RemoveContainer" containerID="1368b589787d7b14188bdd5cbf6d5c41177fec471c90618303e481623b136b42" Jan 30 08:29:18 crc kubenswrapper[4870]: E0130 08:29:18.223547 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8628af25-d5e4-46a0-adec-4c25ca39676b)\"" pod="openstack/watcher-decision-engine-0" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.283950 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-656959885f-8m9f8"] Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.297932 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-656959885f-8m9f8"] Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.384471 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.438759 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d501bb9c-d88d-4362-a48e-4d0347ecc90e-logs\") pod \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.438812 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-config-data\") pod \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.438862 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqd86\" (UniqueName: \"kubernetes.io/projected/d501bb9c-d88d-4362-a48e-4d0347ecc90e-kube-api-access-gqd86\") pod \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.438941 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-combined-ca-bundle\") pod \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\" (UID: \"d501bb9c-d88d-4362-a48e-4d0347ecc90e\") " Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.439202 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d501bb9c-d88d-4362-a48e-4d0347ecc90e-logs" (OuterVolumeSpecName: "logs") pod "d501bb9c-d88d-4362-a48e-4d0347ecc90e" (UID: "d501bb9c-d88d-4362-a48e-4d0347ecc90e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.439494 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d501bb9c-d88d-4362-a48e-4d0347ecc90e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.465108 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d501bb9c-d88d-4362-a48e-4d0347ecc90e-kube-api-access-gqd86" (OuterVolumeSpecName: "kube-api-access-gqd86") pod "d501bb9c-d88d-4362-a48e-4d0347ecc90e" (UID: "d501bb9c-d88d-4362-a48e-4d0347ecc90e"). InnerVolumeSpecName "kube-api-access-gqd86". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.501046 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d501bb9c-d88d-4362-a48e-4d0347ecc90e" (UID: "d501bb9c-d88d-4362-a48e-4d0347ecc90e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.541853 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqd86\" (UniqueName: \"kubernetes.io/projected/d501bb9c-d88d-4362-a48e-4d0347ecc90e-kube-api-access-gqd86\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.541909 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.545021 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-config-data" (OuterVolumeSpecName: "config-data") pod "d501bb9c-d88d-4362-a48e-4d0347ecc90e" (UID: "d501bb9c-d88d-4362-a48e-4d0347ecc90e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.643240 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d501bb9c-d88d-4362-a48e-4d0347ecc90e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.774676 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.802640 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d69bf9957-gj6dt"] Jan 30 08:29:18 crc kubenswrapper[4870]: W0130 08:29:18.812940 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda50dec5c_d013_42b7_8a60_c405d5c93362.slice/crio-79e7e3be0b7084487efb953a05edee78110e607fdba9d97ffe461391c9aa3bcb WatchSource:0}: Error finding container 
79e7e3be0b7084487efb953a05edee78110e607fdba9d97ffe461391c9aa3bcb: Status 404 returned error can't find the container with id 79e7e3be0b7084487efb953a05edee78110e607fdba9d97ffe461391c9aa3bcb Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.923145 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69bf9957-gj6dt" event={"ID":"a50dec5c-d013-42b7-8a60-c405d5c93362","Type":"ContainerStarted","Data":"79e7e3be0b7084487efb953a05edee78110e607fdba9d97ffe461391c9aa3bcb"} Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.939618 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"d501bb9c-d88d-4362-a48e-4d0347ecc90e","Type":"ContainerDied","Data":"d6051b0c5bd2d63d9f43ae131b460100c45ef28a76e95d9c82c1f29baab7429d"} Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.939669 4870 scope.go:117] "RemoveContainer" containerID="ac4b1626212791c0a56219dd9b2abaf2c7a3d91e287f4b28aad9d510e177e2dc" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.939809 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.950769 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerStarted","Data":"ea0f16d0885c4f830bf99a5da540d693451546a651ab72659db0f8c0dde59721"} Jan 30 08:29:18 crc kubenswrapper[4870]: I0130 08:29:18.967923 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f83ba22-1075-4159-b19d-f0b9ceec4ac3","Type":"ContainerStarted","Data":"a557094499d086ce164562ed8ad45b4d04434481b6d30eb79016392157566f9b"} Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.049949 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.086944 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.105150 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Jan 30 08:29:19 crc kubenswrapper[4870]: E0130 08:29:19.105540 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.105557 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:29:19 crc kubenswrapper[4870]: E0130 08:29:19.105599 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" containerName="init" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.105606 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" containerName="init" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.105775 4870 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" containerName="watcher-applier" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.105795 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" containerName="init" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.106426 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.109419 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.109722 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.160294 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtgt\" (UniqueName: \"kubernetes.io/projected/4061e0b3-e3ae-4ef0-a979-6028df77da5c-kube-api-access-2vtgt\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.160543 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4061e0b3-e3ae-4ef0-a979-6028df77da5c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.160585 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4061e0b3-e3ae-4ef0-a979-6028df77da5c-logs\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: 
I0130 08:29:19.160731 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4061e0b3-e3ae-4ef0-a979-6028df77da5c-config-data\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.262240 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vtgt\" (UniqueName: \"kubernetes.io/projected/4061e0b3-e3ae-4ef0-a979-6028df77da5c-kube-api-access-2vtgt\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.262651 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4061e0b3-e3ae-4ef0-a979-6028df77da5c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.262681 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4061e0b3-e3ae-4ef0-a979-6028df77da5c-logs\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.262770 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4061e0b3-e3ae-4ef0-a979-6028df77da5c-config-data\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.264479 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4061e0b3-e3ae-4ef0-a979-6028df77da5c-logs\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.282422 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.283602 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4061e0b3-e3ae-4ef0-a979-6028df77da5c-config-data\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.284096 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4061e0b3-e3ae-4ef0-a979-6028df77da5c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.301415 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vtgt\" (UniqueName: \"kubernetes.io/projected/4061e0b3-e3ae-4ef0-a979-6028df77da5c-kube-api-access-2vtgt\") pod \"watcher-applier-0\" (UID: \"4061e0b3-e3ae-4ef0-a979-6028df77da5c\") " pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.399985 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.469576 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.505616 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.670795 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.766584 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:19 crc kubenswrapper[4870]: I0130 08:29:19.992295 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69bf9957-gj6dt" event={"ID":"a50dec5c-d013-42b7-8a60-c405d5c93362","Type":"ContainerStarted","Data":"682cb1d7ee584cf4bdfb29f68d57d3bf09cfb5067562aa6aa9e2cf5f1e08d2ee"} Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.023640 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="cinder-scheduler" containerID="cri-o://43e89a562f59d4e0caa713b4d3d7d10459a54c334d8bf93738ef4bfb17bc36b1" gracePeriod=30 Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.024672 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0","Type":"ContainerStarted","Data":"cdbf7af2d49e2059c333123a290e9cf8cf7bad8409952cd6d195d8442d2d2794"} Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.024966 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="probe" containerID="cri-o://8eb8ea85632818a452e1edbc831618a751f9abde5e34ca94d13504592653fdb8" gracePeriod=30 Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.137209 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa" path="/var/lib/kubelet/pods/9b9285b0-ed06-4b59-9f79-a1b7a89ac0aa/volumes" Jan 30 08:29:20 crc 
kubenswrapper[4870]: I0130 08:29:20.138001 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d501bb9c-d88d-4362-a48e-4d0347ecc90e" path="/var/lib/kubelet/pods/d501bb9c-d88d-4362-a48e-4d0347ecc90e/volumes" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.178953 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.178935154 podStartE2EDuration="8.178935154s" podCreationTimestamp="2026-01-30 08:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:20.047435942 +0000 UTC m=+1198.742983071" watchObservedRunningTime="2026-01-30 08:29:20.178935154 +0000 UTC m=+1198.874482263" Jan 30 08:29:20 crc kubenswrapper[4870]: W0130 08:29:20.204008 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4061e0b3_e3ae_4ef0_a979_6028df77da5c.slice/crio-d0e619b7c1f9178e64d9b20d2250c41b27f6a0c4164e338c9e1b9a56d79380fa WatchSource:0}: Error finding container d0e619b7c1f9178e64d9b20d2250c41b27f6a0c4164e338c9e1b9a56d79380fa: Status 404 returned error can't find the container with id d0e619b7c1f9178e64d9b20d2250c41b27f6a0c4164e338c9e1b9a56d79380fa Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.209770 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.384401 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.387946 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.392327 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-427ds" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.392512 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.392643 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.403562 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.494809 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config-secret\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.494904 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.494953 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.495016 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpt2x\" (UniqueName: \"kubernetes.io/projected/1a63902b-36ef-479e-8124-86f7a7f3f8db-kube-api-access-cpt2x\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.596920 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.597315 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpt2x\" (UniqueName: \"kubernetes.io/projected/1a63902b-36ef-479e-8124-86f7a7f3f8db-kube-api-access-cpt2x\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.597394 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config-secret\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.597474 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.597754 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.601218 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.617330 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config-secret\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.617832 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpt2x\" (UniqueName: \"kubernetes.io/projected/1a63902b-36ef-479e-8124-86f7a7f3f8db-kube-api-access-cpt2x\") pod \"openstackclient\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.762722 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.763465 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.793796 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.841937 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.843224 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.858789 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.906198 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a0d39-f7b0-4468-a82f-9fcc49fc1281-combined-ca-bundle\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.906314 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/204a0d39-f7b0-4468-a82f-9fcc49fc1281-openstack-config-secret\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.906345 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/204a0d39-f7b0-4468-a82f-9fcc49fc1281-openstack-config\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:20 crc kubenswrapper[4870]: I0130 08:29:20.906401 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-dvzdt\" (UniqueName: \"kubernetes.io/projected/204a0d39-f7b0-4468-a82f-9fcc49fc1281-kube-api-access-dvzdt\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.015244 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a0d39-f7b0-4468-a82f-9fcc49fc1281-combined-ca-bundle\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.015364 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/204a0d39-f7b0-4468-a82f-9fcc49fc1281-openstack-config-secret\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.015385 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/204a0d39-f7b0-4468-a82f-9fcc49fc1281-openstack-config\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.015437 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvzdt\" (UniqueName: \"kubernetes.io/projected/204a0d39-f7b0-4468-a82f-9fcc49fc1281-kube-api-access-dvzdt\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.020554 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/204a0d39-f7b0-4468-a82f-9fcc49fc1281-openstack-config\") pod 
\"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.025708 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/204a0d39-f7b0-4468-a82f-9fcc49fc1281-openstack-config-secret\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.039811 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a0d39-f7b0-4468-a82f-9fcc49fc1281-combined-ca-bundle\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.071704 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvzdt\" (UniqueName: \"kubernetes.io/projected/204a0d39-f7b0-4468-a82f-9fcc49fc1281-kube-api-access-dvzdt\") pod \"openstackclient\" (UID: \"204a0d39-f7b0-4468-a82f-9fcc49fc1281\") " pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.087057 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f83ba22-1075-4159-b19d-f0b9ceec4ac3","Type":"ContainerStarted","Data":"fcef3a1004b84a8cdcd9a3aaa97b96b2c634246495cb7db1a07176496c9f5b70"} Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.087211 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-log" containerID="cri-o://a557094499d086ce164562ed8ad45b4d04434481b6d30eb79016392157566f9b" gracePeriod=30 Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.087533 4870 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/glance-default-internal-api-0" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-httpd" containerID="cri-o://fcef3a1004b84a8cdcd9a3aaa97b96b2c634246495cb7db1a07176496c9f5b70" gracePeriod=30 Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.124263 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4061e0b3-e3ae-4ef0-a979-6028df77da5c","Type":"ContainerStarted","Data":"d2981178746e59630fa9af78b555b16f88ae090d539ab38dace0e408ac984122"} Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.124305 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4061e0b3-e3ae-4ef0-a979-6028df77da5c","Type":"ContainerStarted","Data":"d0e619b7c1f9178e64d9b20d2250c41b27f6a0c4164e338c9e1b9a56d79380fa"} Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.147392 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.147365131 podStartE2EDuration="9.147365131s" podCreationTimestamp="2026-01-30 08:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:21.125210215 +0000 UTC m=+1199.820757324" watchObservedRunningTime="2026-01-30 08:29:21.147365131 +0000 UTC m=+1199.842912240" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.149259 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-log" containerID="cri-o://2d530828db1e13fb59b8a6f3bf5f6b711b1aef5265c44e2042172678ae798ad8" gracePeriod=30 Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.149571 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d69bf9957-gj6dt" 
event={"ID":"a50dec5c-d013-42b7-8a60-c405d5c93362","Type":"ContainerStarted","Data":"7344b11ec3ba1aa5689326e47ed586eecca8d1c99a0b89bf83ead8f26faf7332"} Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.149725 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-httpd" containerID="cri-o://cdbf7af2d49e2059c333123a290e9cf8cf7bad8409952cd6d195d8442d2d2794" gracePeriod=30 Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.150208 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.174497 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d69bf9957-gj6dt" podStartSLOduration=4.174481752 podStartE2EDuration="4.174481752s" podCreationTimestamp="2026-01-30 08:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:21.172835031 +0000 UTC m=+1199.868382140" watchObservedRunningTime="2026-01-30 08:29:21.174481752 +0000 UTC m=+1199.870028861" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.190125 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.184986302 podStartE2EDuration="3.184986302s" podCreationTimestamp="2026-01-30 08:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:21.149778556 +0000 UTC m=+1199.845325665" watchObservedRunningTime="2026-01-30 08:29:21.184986302 +0000 UTC m=+1199.880533411" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.197351 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.651490 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:21 crc kubenswrapper[4870]: I0130 08:29:21.786040 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.191227 4870 generic.go:334] "Generic (PLEG): container finished" podID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerID="8eb8ea85632818a452e1edbc831618a751f9abde5e34ca94d13504592653fdb8" exitCode=0 Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.191470 4870 generic.go:334] "Generic (PLEG): container finished" podID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerID="43e89a562f59d4e0caa713b4d3d7d10459a54c334d8bf93738ef4bfb17bc36b1" exitCode=0 Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.191506 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f839b4e9-f9f0-489d-b04b-14b03ab6895b","Type":"ContainerDied","Data":"8eb8ea85632818a452e1edbc831618a751f9abde5e34ca94d13504592653fdb8"} Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.191529 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f839b4e9-f9f0-489d-b04b-14b03ab6895b","Type":"ContainerDied","Data":"43e89a562f59d4e0caa713b4d3d7d10459a54c334d8bf93738ef4bfb17bc36b1"} Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.193568 4870 generic.go:334] "Generic (PLEG): container finished" podID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerID="fcef3a1004b84a8cdcd9a3aaa97b96b2c634246495cb7db1a07176496c9f5b70" exitCode=0 Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.193599 4870 generic.go:334] "Generic (PLEG): container finished" podID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" 
containerID="a557094499d086ce164562ed8ad45b4d04434481b6d30eb79016392157566f9b" exitCode=143 Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.193638 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f83ba22-1075-4159-b19d-f0b9ceec4ac3","Type":"ContainerDied","Data":"fcef3a1004b84a8cdcd9a3aaa97b96b2c634246495cb7db1a07176496c9f5b70"} Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.193665 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f83ba22-1075-4159-b19d-f0b9ceec4ac3","Type":"ContainerDied","Data":"a557094499d086ce164562ed8ad45b4d04434481b6d30eb79016392157566f9b"} Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.201906 4870 generic.go:334] "Generic (PLEG): container finished" podID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerID="cdbf7af2d49e2059c333123a290e9cf8cf7bad8409952cd6d195d8442d2d2794" exitCode=0 Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.201937 4870 generic.go:334] "Generic (PLEG): container finished" podID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerID="2d530828db1e13fb59b8a6f3bf5f6b711b1aef5265c44e2042172678ae798ad8" exitCode=143 Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.202774 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0","Type":"ContainerDied","Data":"cdbf7af2d49e2059c333123a290e9cf8cf7bad8409952cd6d195d8442d2d2794"} Jan 30 08:29:22 crc kubenswrapper[4870]: I0130 08:29:22.202799 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0","Type":"ContainerDied","Data":"2d530828db1e13fb59b8a6f3bf5f6b711b1aef5265c44e2042172678ae798ad8"} Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.226110 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5d675956bc-zzkss" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.317801 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.326641 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbb4d475f-66fsw"] Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.353168 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" podUID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerName="dnsmasq-dns" containerID="cri-o://893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb" gracePeriod=10 Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.475347 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484005 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484461 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfc5p\" (UniqueName: \"kubernetes.io/projected/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-kube-api-access-qfc5p\") pod \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484500 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-config-data\") pod \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484533 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-logs\") pod \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484662 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484702 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-scripts\") pod \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484800 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-combined-ca-bundle\") pod \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.484815 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-httpd-run\") pod \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\" (UID: \"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.485602 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" (UID: "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.498227 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-logs" (OuterVolumeSpecName: "logs") pod "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" (UID: "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.505079 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-scripts" (OuterVolumeSpecName: "scripts") pod "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" (UID: "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.509121 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-kube-api-access-qfc5p" (OuterVolumeSpecName: "kube-api-access-qfc5p") pod "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" (UID: "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0"). InnerVolumeSpecName "kube-api-access-qfc5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.515039 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" (UID: "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.586064 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-config-data\") pod \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.586345 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcs4s\" (UniqueName: \"kubernetes.io/projected/f839b4e9-f9f0-489d-b04b-14b03ab6895b-kube-api-access-wcs4s\") pod \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.586465 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-scripts\") pod \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " Jan 30 08:29:23 crc 
kubenswrapper[4870]: I0130 08:29:23.586573 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-httpd-run\") pod \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.586651 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data-custom\") pod \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.586745 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-logs\") pod \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.586900 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.586993 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-scripts\") pod \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.587072 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx4qh\" (UniqueName: \"kubernetes.io/projected/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-kube-api-access-tx4qh\") pod \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\" (UID: 
\"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.587238 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f839b4e9-f9f0-489d-b04b-14b03ab6895b-etc-machine-id\") pod \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.587312 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle\") pod \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.587398 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data\") pod \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.587484 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-combined-ca-bundle\") pod \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\" (UID: \"9f83ba22-1075-4159-b19d-f0b9ceec4ac3\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.589015 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfc5p\" (UniqueName: \"kubernetes.io/projected/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-kube-api-access-qfc5p\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.589118 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-logs\") on node \"crc\" DevicePath \"\"" Jan 30 
08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.589189 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.589245 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.589308 4870 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.589510 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f839b4e9-f9f0-489d-b04b-14b03ab6895b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f839b4e9-f9f0-489d-b04b-14b03ab6895b" (UID: "f839b4e9-f9f0-489d-b04b-14b03ab6895b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.595190 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f83ba22-1075-4159-b19d-f0b9ceec4ac3" (UID: "9f83ba22-1075-4159-b19d-f0b9ceec4ac3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.599233 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-logs" (OuterVolumeSpecName: "logs") pod "9f83ba22-1075-4159-b19d-f0b9ceec4ac3" (UID: "9f83ba22-1075-4159-b19d-f0b9ceec4ac3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.605059 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" (UID: "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.622079 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f839b4e9-f9f0-489d-b04b-14b03ab6895b" (UID: "f839b4e9-f9f0-489d-b04b-14b03ab6895b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.633831 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-scripts" (OuterVolumeSpecName: "scripts") pod "f839b4e9-f9f0-489d-b04b-14b03ab6895b" (UID: "f839b4e9-f9f0-489d-b04b-14b03ab6895b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.637030 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f839b4e9-f9f0-489d-b04b-14b03ab6895b-kube-api-access-wcs4s" (OuterVolumeSpecName: "kube-api-access-wcs4s") pod "f839b4e9-f9f0-489d-b04b-14b03ab6895b" (UID: "f839b4e9-f9f0-489d-b04b-14b03ab6895b"). InnerVolumeSpecName "kube-api-access-wcs4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.637150 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-kube-api-access-tx4qh" (OuterVolumeSpecName: "kube-api-access-tx4qh") pod "9f83ba22-1075-4159-b19d-f0b9ceec4ac3" (UID: "9f83ba22-1075-4159-b19d-f0b9ceec4ac3"). InnerVolumeSpecName "kube-api-access-tx4qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.646013 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-config-data" (OuterVolumeSpecName: "config-data") pod "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" (UID: "f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.646130 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "9f83ba22-1075-4159-b19d-f0b9ceec4ac3" (UID: "9f83ba22-1075-4159-b19d-f0b9ceec4ac3"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.659014 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-scripts" (OuterVolumeSpecName: "scripts") pod "9f83ba22-1075-4159-b19d-f0b9ceec4ac3" (UID: "9f83ba22-1075-4159-b19d-f0b9ceec4ac3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.667268 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.669076 4870 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692083 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692118 4870 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692146 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692157 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692167 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx4qh\" (UniqueName: \"kubernetes.io/projected/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-kube-api-access-tx4qh\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692178 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 
08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692187 4870 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f839b4e9-f9f0-489d-b04b-14b03ab6895b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692194 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692202 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcs4s\" (UniqueName: \"kubernetes.io/projected/f839b4e9-f9f0-489d-b04b-14b03ab6895b-kube-api-access-wcs4s\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692209 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692217 4870 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.692224 4870 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.755141 4870 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.766602 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f83ba22-1075-4159-b19d-f0b9ceec4ac3" (UID: "9f83ba22-1075-4159-b19d-f0b9ceec4ac3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.792280 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-config-data" (OuterVolumeSpecName: "config-data") pod "9f83ba22-1075-4159-b19d-f0b9ceec4ac3" (UID: "9f83ba22-1075-4159-b19d-f0b9ceec4ac3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.798035 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f839b4e9-f9f0-489d-b04b-14b03ab6895b" (UID: "f839b4e9-f9f0-489d-b04b-14b03ab6895b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.811147 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle\") pod \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\" (UID: \"f839b4e9-f9f0-489d-b04b-14b03ab6895b\") " Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.811797 4870 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.811815 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.811827 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f83ba22-1075-4159-b19d-f0b9ceec4ac3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: W0130 08:29:23.811917 4870 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f839b4e9-f9f0-489d-b04b-14b03ab6895b/volumes/kubernetes.io~secret/combined-ca-bundle Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.811929 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f839b4e9-f9f0-489d-b04b-14b03ab6895b" (UID: "f839b4e9-f9f0-489d-b04b-14b03ab6895b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: E0130 08:29:23.848133 4870 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 30 08:29:23 crc kubenswrapper[4870]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_1a63902b-36ef-479e-8124-86f7a7f3f8db_0(a4ead0b94abe15dba32bbce83dc44d7770575e44e2d3952e71c8eacb73dcbb67): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a4ead0b94abe15dba32bbce83dc44d7770575e44e2d3952e71c8eacb73dcbb67" Netns:"/var/run/netns/75082fd7-70b5-4f46-b731-308218064a72" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a4ead0b94abe15dba32bbce83dc44d7770575e44e2d3952e71c8eacb73dcbb67;K8S_POD_UID=1a63902b-36ef-479e-8124-86f7a7f3f8db" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/1a63902b-36ef-479e-8124-86f7a7f3f8db]: expected pod UID "1a63902b-36ef-479e-8124-86f7a7f3f8db" but got "204a0d39-f7b0-4468-a82f-9fcc49fc1281" from Kube API Jan 30 08:29:23 crc kubenswrapper[4870]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 08:29:23 crc kubenswrapper[4870]: > Jan 30 08:29:23 crc kubenswrapper[4870]: E0130 08:29:23.848208 4870 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 30 08:29:23 crc kubenswrapper[4870]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_openstackclient_openstack_1a63902b-36ef-479e-8124-86f7a7f3f8db_0(a4ead0b94abe15dba32bbce83dc44d7770575e44e2d3952e71c8eacb73dcbb67): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a4ead0b94abe15dba32bbce83dc44d7770575e44e2d3952e71c8eacb73dcbb67" Netns:"/var/run/netns/75082fd7-70b5-4f46-b731-308218064a72" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a4ead0b94abe15dba32bbce83dc44d7770575e44e2d3952e71c8eacb73dcbb67;K8S_POD_UID=1a63902b-36ef-479e-8124-86f7a7f3f8db" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/1a63902b-36ef-479e-8124-86f7a7f3f8db]: expected pod UID "1a63902b-36ef-479e-8124-86f7a7f3f8db" but got "204a0d39-f7b0-4468-a82f-9fcc49fc1281" from Kube API Jan 30 08:29:23 crc kubenswrapper[4870]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 08:29:23 crc kubenswrapper[4870]: > pod="openstack/openstackclient" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.897011 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data" (OuterVolumeSpecName: "config-data") pod "f839b4e9-f9f0-489d-b04b-14b03ab6895b" (UID: "f839b4e9-f9f0-489d-b04b-14b03ab6895b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.914508 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:23 crc kubenswrapper[4870]: I0130 08:29:23.914540 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f839b4e9-f9f0-489d-b04b-14b03ab6895b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.055409 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.115734 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.227633 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-svc\") pod \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.227667 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-nb\") pod \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.227745 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-sb\") pod \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " Jan 30 
08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.227860 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-swift-storage-0\") pod \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.227912 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7zw7\" (UniqueName: \"kubernetes.io/projected/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-kube-api-access-r7zw7\") pod \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.227940 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-config\") pod \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\" (UID: \"7fc2a1f3-54bc-4554-a413-69bc35b58a2f\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.236743 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"204a0d39-f7b0-4468-a82f-9fcc49fc1281","Type":"ContainerStarted","Data":"9bd41cb2f723a62f616dfdcd31d5ddca0f265559ea5e020a6d33f3e5a2804f90"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.236782 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-kube-api-access-r7zw7" (OuterVolumeSpecName: "kube-api-access-r7zw7") pod "7fc2a1f3-54bc-4554-a413-69bc35b58a2f" (UID: "7fc2a1f3-54bc-4554-a413-69bc35b58a2f"). InnerVolumeSpecName "kube-api-access-r7zw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.250390 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9f83ba22-1075-4159-b19d-f0b9ceec4ac3","Type":"ContainerDied","Data":"8eeee39b66f7aa76b98a5db5ced5f5152684b0c7a60879e1f3f2cf389449b18e"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.250414 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.250436 4870 scope.go:117] "RemoveContainer" containerID="fcef3a1004b84a8cdcd9a3aaa97b96b2c634246495cb7db1a07176496c9f5b70" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.256641 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0","Type":"ContainerDied","Data":"fab1f41c1ff636465358dde7b3c49c985eff8cd0920b9b81dab7f18f354ae31d"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.256717 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.263274 4870 generic.go:334] "Generic (PLEG): container finished" podID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerID="893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb" exitCode=0 Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.263331 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" event={"ID":"7fc2a1f3-54bc-4554-a413-69bc35b58a2f","Type":"ContainerDied","Data":"893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.263355 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" event={"ID":"7fc2a1f3-54bc-4554-a413-69bc35b58a2f","Type":"ContainerDied","Data":"7e31fa7de08f6fe5e038c395ec730d87de7703f0769687addc3ef103068cb495"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.263410 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbb4d475f-66fsw" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.268798 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f839b4e9-f9f0-489d-b04b-14b03ab6895b","Type":"ContainerDied","Data":"104aebc0ebfaddceac864d04e025d33e5cf5c1cacba0ac14afe67f72e174ead6"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.268897 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.278913 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.279983 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerStarted","Data":"532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.280038 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerStarted","Data":"a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36"} Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.313646 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fc2a1f3-54bc-4554-a413-69bc35b58a2f" (UID: "7fc2a1f3-54bc-4554-a413-69bc35b58a2f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.322912 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7fc2a1f3-54bc-4554-a413-69bc35b58a2f" (UID: "7fc2a1f3-54bc-4554-a413-69bc35b58a2f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.330157 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.330181 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.330192 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7zw7\" (UniqueName: \"kubernetes.io/projected/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-kube-api-access-r7zw7\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.330660 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fc2a1f3-54bc-4554-a413-69bc35b58a2f" (UID: "7fc2a1f3-54bc-4554-a413-69bc35b58a2f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.353055 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5564cc7ccb-wnwrs"] Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.353930 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerName="init" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.353943 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerName="init" Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.353971 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="cinder-scheduler" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.353980 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="cinder-scheduler" Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.353995 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-httpd" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354001 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-httpd" Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.354022 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-httpd" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354028 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-httpd" Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.354057 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="probe" Jan 30 08:29:24 crc 
kubenswrapper[4870]: I0130 08:29:24.354063 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="probe" Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.354077 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-log" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354086 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-log" Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.354094 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-log" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354101 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-log" Jan 30 08:29:24 crc kubenswrapper[4870]: E0130 08:29:24.354125 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerName="dnsmasq-dns" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354133 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerName="dnsmasq-dns" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354449 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-log" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354461 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-log" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354476 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" containerName="dnsmasq-dns" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354493 4870 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="probe" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354515 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" containerName="glance-httpd" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354526 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" containerName="glance-httpd" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.354545 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" containerName="cinder-scheduler" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.392325 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fc2a1f3-54bc-4554-a413-69bc35b58a2f" (UID: "7fc2a1f3-54bc-4554-a413-69bc35b58a2f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.407176 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5564cc7ccb-wnwrs"] Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.409085 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.410380 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-config" (OuterVolumeSpecName: "config") pod "7fc2a1f3-54bc-4554-a413-69bc35b58a2f" (UID: "7fc2a1f3-54bc-4554-a413-69bc35b58a2f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.413394 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.429327 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.438792 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.438894 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.438914 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc2a1f3-54bc-4554-a413-69bc35b58a2f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.507155 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.540434 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8zfh\" (UniqueName: \"kubernetes.io/projected/304a486b-b7cf-4418-82c9-7795b2331284-kube-api-access-b8zfh\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.540484 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-public-tls-certs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.540576 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-config-data-custom\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.540602 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-combined-ca-bundle\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.540645 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-config-data\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.540670 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-internal-tls-certs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.540705 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/304a486b-b7cf-4418-82c9-7795b2331284-logs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.644821 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-config-data-custom\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.644873 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-combined-ca-bundle\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.644934 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-config-data\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.644967 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-internal-tls-certs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.645002 4870 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/304a486b-b7cf-4418-82c9-7795b2331284-logs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.645035 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8zfh\" (UniqueName: \"kubernetes.io/projected/304a486b-b7cf-4418-82c9-7795b2331284-kube-api-access-b8zfh\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.645058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-public-tls-certs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.646482 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-config-data-custom\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.646774 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/304a486b-b7cf-4418-82c9-7795b2331284-logs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.650528 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-internal-tls-certs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.651246 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-config-data\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.651713 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-combined-ca-bundle\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.652608 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/304a486b-b7cf-4418-82c9-7795b2331284-public-tls-certs\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.683753 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8zfh\" (UniqueName: \"kubernetes.io/projected/304a486b-b7cf-4418-82c9-7795b2331284-kube-api-access-b8zfh\") pod \"barbican-api-5564cc7ccb-wnwrs\" (UID: \"304a486b-b7cf-4418-82c9-7795b2331284\") " pod="openstack/barbican-api-5564cc7ccb-wnwrs" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.710919 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74569d8966-5sjxs" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.803052 4870 scope.go:117] "RemoveContainer" containerID="a557094499d086ce164562ed8ad45b4d04434481b6d30eb79016392157566f9b" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.814587 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.826351 4870 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1a63902b-36ef-479e-8124-86f7a7f3f8db" podUID="204a0d39-f7b0-4468-a82f-9fcc49fc1281" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.852262 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config-secret\") pod \"1a63902b-36ef-479e-8124-86f7a7f3f8db\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.852314 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config\") pod \"1a63902b-36ef-479e-8124-86f7a7f3f8db\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.852369 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-combined-ca-bundle\") pod \"1a63902b-36ef-479e-8124-86f7a7f3f8db\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.852399 4870 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cpt2x\" (UniqueName: \"kubernetes.io/projected/1a63902b-36ef-479e-8124-86f7a7f3f8db-kube-api-access-cpt2x\") pod \"1a63902b-36ef-479e-8124-86f7a7f3f8db\" (UID: \"1a63902b-36ef-479e-8124-86f7a7f3f8db\") " Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.854164 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1a63902b-36ef-479e-8124-86f7a7f3f8db" (UID: "1a63902b-36ef-479e-8124-86f7a7f3f8db"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.865626 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a63902b-36ef-479e-8124-86f7a7f3f8db-kube-api-access-cpt2x" (OuterVolumeSpecName: "kube-api-access-cpt2x") pod "1a63902b-36ef-479e-8124-86f7a7f3f8db" (UID: "1a63902b-36ef-479e-8124-86f7a7f3f8db"). InnerVolumeSpecName "kube-api-access-cpt2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.870085 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a63902b-36ef-479e-8124-86f7a7f3f8db" (UID: "1a63902b-36ef-479e-8124-86f7a7f3f8db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.884670 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1a63902b-36ef-479e-8124-86f7a7f3f8db" (UID: "1a63902b-36ef-479e-8124-86f7a7f3f8db"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.955760 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.955796 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1a63902b-36ef-479e-8124-86f7a7f3f8db-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.955805 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a63902b-36ef-479e-8124-86f7a7f3f8db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.955816 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpt2x\" (UniqueName: \"kubernetes.io/projected/1a63902b-36ef-479e-8124-86f7a7f3f8db-kube-api-access-cpt2x\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:24 crc kubenswrapper[4870]: I0130 08:29:24.967870 4870 scope.go:117] "RemoveContainer" containerID="cdbf7af2d49e2059c333123a290e9cf8cf7bad8409952cd6d195d8442d2d2794" Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.023106 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.032030 4870 scope.go:117] "RemoveContainer" containerID="2d530828db1e13fb59b8a6f3bf5f6b711b1aef5265c44e2042172678ae798ad8" Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.047374 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5564cc7ccb-wnwrs"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.051536 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.071538 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.085949 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.095521 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.097018 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.113398 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbb4d475f-66fsw"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.117404 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.122581 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fbb4d475f-66fsw"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.134052 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.146448 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.157899 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.160488 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.160552 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.160604 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.160677 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.160711 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.160727 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r56k4\" (UniqueName: \"kubernetes.io/projected/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-kube-api-access-r56k4\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.173402 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.175090 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.180267 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.180337 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.180565 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.180720 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-58ht6"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.182472 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.207921 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.209438 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.218920 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.219094 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.226078 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.264747 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.264808 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.264835 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r56k4\" (UniqueName: \"kubernetes.io/projected/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-kube-api-access-r56k4\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.264947 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.264991 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.265043 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.265271 4870 scope.go:117] "RemoveContainer" containerID="893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.266997 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.278473 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.278592 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.288701 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.288750 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.293364 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r56k4\" (UniqueName: \"kubernetes.io/projected/e7a1bbc0-d212-4a83-bea0-d40c261ddb18-kube-api-access-r56k4\") pod \"cinder-scheduler-0\" (UID: \"e7a1bbc0-d212-4a83-bea0-d40c261ddb18\") " pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.322688 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.349162 4870 scope.go:117] "RemoveContainer" containerID="38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366492 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366535 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366563 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366594 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366612 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366633 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366656 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-logs\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366674 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366693 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366734 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.366942 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46d96\" (UniqueName: \"kubernetes.io/projected/65bb64e7-45f2-4b8d-94f0-34c21ac75042-kube-api-access-46d96\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.367316 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp296\" (UniqueName: \"kubernetes.io/projected/01e7af93-8480-4484-9558-5455eb00fa2b-kube-api-access-bp296\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.367339 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.367406 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.367422 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-logs\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.367451 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.369254 4870 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1a63902b-36ef-479e-8124-86f7a7f3f8db" podUID="204a0d39-f7b0-4468-a82f-9fcc49fc1281"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.402300 4870 scope.go:117] "RemoveContainer" containerID="893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb"
Jan 30 08:29:25 crc kubenswrapper[4870]: E0130 08:29:25.411076 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb\": container with ID starting with 893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb not found: ID does not exist" containerID="893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.411131 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb"} err="failed to get container status \"893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb\": rpc error: code = NotFound desc = could not find container \"893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb\": container with ID starting with 893bc6818effa10ecee42122ccd6f6afc59777130894069439f5549f0dfabacb not found: ID does not exist"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.411158 4870 scope.go:117] "RemoveContainer" containerID="38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c"
Jan 30 08:29:25 crc kubenswrapper[4870]: E0130 08:29:25.414732 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c\": container with ID starting with 38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c not found: ID does not exist" containerID="38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.414762 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c"} err="failed to get container status \"38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c\": rpc error: code = NotFound desc = could not find container \"38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c\": container with ID starting with 38a1debf6b26f10553fce6710f3f79987242363403d1aac4313b2a726405eb4c not found: ID does not exist"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.414800 4870 scope.go:117] "RemoveContainer" containerID="8eb8ea85632818a452e1edbc831618a751f9abde5e34ca94d13504592653fdb8"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469113 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469160 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469183 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469217 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46d96\" (UniqueName: \"kubernetes.io/projected/65bb64e7-45f2-4b8d-94f0-34c21ac75042-kube-api-access-46d96\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469254 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp296\" (UniqueName: \"kubernetes.io/projected/01e7af93-8480-4484-9558-5455eb00fa2b-kube-api-access-bp296\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469271 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469309 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469327 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-logs\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469366 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469418 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469448 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469485 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469517 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469536 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469553 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.469576 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-logs\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.470153 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-logs\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.470439 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-logs\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.471332 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.471431 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.472410 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.481713 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.484247 4870 scope.go:117] "RemoveContainer" containerID="43e89a562f59d4e0caa713b4d3d7d10459a54c334d8bf93738ef4bfb17bc36b1"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.484974 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.488241 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.496488 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.497451 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.498066 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.498480 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.499031 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.505461 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp296\" (UniqueName: \"kubernetes.io/projected/01e7af93-8480-4484-9558-5455eb00fa2b-kube-api-access-bp296\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.508953 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.511118 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46d96\" (UniqueName: \"kubernetes.io/projected/65bb64e7-45f2-4b8d-94f0-34c21ac75042-kube-api-access-46d96\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.520119 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.545430 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") " pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.590530 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.621488 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.637615 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:25 crc kubenswrapper[4870]: I0130 08:29:25.820565 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5564cc7ccb-wnwrs"]
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.098091 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a63902b-36ef-479e-8124-86f7a7f3f8db" path="/var/lib/kubelet/pods/1a63902b-36ef-479e-8124-86f7a7f3f8db/volumes"
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.098569 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc2a1f3-54bc-4554-a413-69bc35b58a2f" path="/var/lib/kubelet/pods/7fc2a1f3-54bc-4554-a413-69bc35b58a2f/volumes"
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.105004 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f83ba22-1075-4159-b19d-f0b9ceec4ac3" path="/var/lib/kubelet/pods/9f83ba22-1075-4159-b19d-f0b9ceec4ac3/volumes"
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.109068 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0" path="/var/lib/kubelet/pods/f7ab1ad2-11f4-4117-a7c9-9fbfe6e4cbc0/volumes"
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.109726 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f839b4e9-f9f0-489d-b04b-14b03ab6895b" path="/var/lib/kubelet/pods/f839b4e9-f9f0-489d-b04b-14b03ab6895b/volumes"
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.318661 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.360439 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerStarted","Data":"d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08"}
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.369607 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5564cc7ccb-wnwrs" event={"ID":"304a486b-b7cf-4418-82c9-7795b2331284","Type":"ContainerStarted","Data":"f7e5f6b0e4f969e3e5827d9a0ddf7fe62514654e337c73550a66c3dccfc2ec73"}
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.369680 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5564cc7ccb-wnwrs" event={"ID":"304a486b-b7cf-4418-82c9-7795b2331284","Type":"ContainerStarted","Data":"79c4f3c21771817b6dd7e4cec0be32db1ab83c3de292da2f51b729907cc29073"}
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.551969 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 08:29:26 crc kubenswrapper[4870]: I0130 08:29:26.650850 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:27 crc kubenswrapper[4870]: I0130 08:29:27.384138 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a1bbc0-d212-4a83-bea0-d40c261ddb18","Type":"ContainerStarted","Data":"2353d39e0d2cc696c9a74d198b852c6958c449f8e29f1bc712e4ca7a874c28b7"}
Jan 30 08:29:27 crc kubenswrapper[4870]: I0130 08:29:27.388474 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65bb64e7-45f2-4b8d-94f0-34c21ac75042","Type":"ContainerStarted","Data":"efd4740849ad794d4e57051943e44782204bcc76846a706c3d143892f1a69cb3"}
Jan 30 08:29:27 crc kubenswrapper[4870]: I0130 08:29:27.401542 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5564cc7ccb-wnwrs" event={"ID":"304a486b-b7cf-4418-82c9-7795b2331284","Type":"ContainerStarted","Data":"fd619663bd6a9506b679253344163273ee3ee7b2cf9826d6c161e14947ad6cde"}
Jan 30 08:29:27 crc kubenswrapper[4870]: I0130 08:29:27.402633 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5564cc7ccb-wnwrs"
Jan 30 08:29:27 crc kubenswrapper[4870]: I0130 08:29:27.402675 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5564cc7ccb-wnwrs"
Jan 30 08:29:27 crc kubenswrapper[4870]: I0130 08:29:27.404442 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01e7af93-8480-4484-9558-5455eb00fa2b","Type":"ContainerStarted","Data":"eda82349da897444026066edb7a8f71a1933756e2aef786c074692bf323e90ef"}
Jan 30 08:29:27 crc kubenswrapper[4870]: I0130 08:29:27.427372 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5564cc7ccb-wnwrs" podStartSLOduration=3.427353742 podStartE2EDuration="3.427353742s" podCreationTimestamp="2026-01-30 08:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:27.426101782 +0000 UTC m=+1206.121648891" watchObservedRunningTime="2026-01-30 08:29:27.427353742 +0000 UTC m=+1206.122900851"
Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.217977 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.218554 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.219331 4870 scope.go:117] "RemoveContainer" containerID="1368b589787d7b14188bdd5cbf6d5c41177fec471c90618303e481623b136b42"
Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.424361 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01e7af93-8480-4484-9558-5455eb00fa2b","Type":"ContainerStarted","Data":"b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe"}
Jan 30 08:29:28 crc
kubenswrapper[4870]: I0130 08:29:28.430097 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerStarted","Data":"655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7"} Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.430242 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.432163 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a1bbc0-d212-4a83-bea0-d40c261ddb18","Type":"ContainerStarted","Data":"33e244873a0e2a71b8bf3785a2b8928bcd1c3e50dc1af9364a6614c001d155b0"} Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.434342 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65bb64e7-45f2-4b8d-94f0-34c21ac75042","Type":"ContainerStarted","Data":"2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1"} Jan 30 08:29:28 crc kubenswrapper[4870]: I0130 08:29:28.459186 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.470738873 podStartE2EDuration="12.45916508s" podCreationTimestamp="2026-01-30 08:29:16 +0000 UTC" firstStartedPulling="2026-01-30 08:29:18.185244124 +0000 UTC m=+1196.880791233" lastFinishedPulling="2026-01-30 08:29:27.173670331 +0000 UTC m=+1205.869217440" observedRunningTime="2026-01-30 08:29:28.452796219 +0000 UTC m=+1207.148343328" watchObservedRunningTime="2026-01-30 08:29:28.45916508 +0000 UTC m=+1207.154712199" Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.452783 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerStarted","Data":"bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba"} Jan 30 08:29:29 
crc kubenswrapper[4870]: I0130 08:29:29.455220 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01e7af93-8480-4484-9558-5455eb00fa2b","Type":"ContainerStarted","Data":"5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5"} Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.456858 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a1bbc0-d212-4a83-bea0-d40c261ddb18","Type":"ContainerStarted","Data":"b5de64248c4680443dbd47aee692a183cc11004db6a62b46d676989b11c3d021"} Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.462286 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65bb64e7-45f2-4b8d-94f0-34c21ac75042","Type":"ContainerStarted","Data":"7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18"} Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.508217 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.530300 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.530276352 podStartE2EDuration="4.530276352s" podCreationTimestamp="2026-01-30 08:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:29.51616901 +0000 UTC m=+1208.211716119" watchObservedRunningTime="2026-01-30 08:29:29.530276352 +0000 UTC m=+1208.225823461" Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.560772 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.560754741 podStartE2EDuration="4.560754741s" podCreationTimestamp="2026-01-30 08:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:29.544943244 +0000 UTC m=+1208.240490353" watchObservedRunningTime="2026-01-30 08:29:29.560754741 +0000 UTC m=+1208.256301850" Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.578469 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Jan 30 08:29:29 crc kubenswrapper[4870]: I0130 08:29:29.581839 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.581821702 podStartE2EDuration="4.581821702s" podCreationTimestamp="2026-01-30 08:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:29.574941257 +0000 UTC m=+1208.270488356" watchObservedRunningTime="2026-01-30 08:29:29.581821702 +0000 UTC m=+1208.277368811" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.356912 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74569d8966-5sjxs" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.362758 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56cfc8cc98-pfz9w" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.365777 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56cfc8cc98-pfz9w" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.429457 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-config-data\") pod \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.429498 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-scripts\") pod \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.429526 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdz8p\" (UniqueName: \"kubernetes.io/projected/1872a14d-aeff-46f7-8430-c6fe0eb6973b-kube-api-access-fdz8p\") pod \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.429584 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-tls-certs\") pod \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.429624 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-combined-ca-bundle\") pod \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.429754 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1872a14d-aeff-46f7-8430-c6fe0eb6973b-logs\") pod \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.429817 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-secret-key\") pod \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\" (UID: \"1872a14d-aeff-46f7-8430-c6fe0eb6973b\") " Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.435151 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1872a14d-aeff-46f7-8430-c6fe0eb6973b-kube-api-access-fdz8p" (OuterVolumeSpecName: "kube-api-access-fdz8p") pod "1872a14d-aeff-46f7-8430-c6fe0eb6973b" (UID: "1872a14d-aeff-46f7-8430-c6fe0eb6973b"). InnerVolumeSpecName "kube-api-access-fdz8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.435414 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1872a14d-aeff-46f7-8430-c6fe0eb6973b-logs" (OuterVolumeSpecName: "logs") pod "1872a14d-aeff-46f7-8430-c6fe0eb6973b" (UID: "1872a14d-aeff-46f7-8430-c6fe0eb6973b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.446121 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1872a14d-aeff-46f7-8430-c6fe0eb6973b" (UID: "1872a14d-aeff-46f7-8430-c6fe0eb6973b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.467554 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-scripts" (OuterVolumeSpecName: "scripts") pod "1872a14d-aeff-46f7-8430-c6fe0eb6973b" (UID: "1872a14d-aeff-46f7-8430-c6fe0eb6973b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.473561 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-config-data" (OuterVolumeSpecName: "config-data") pod "1872a14d-aeff-46f7-8430-c6fe0eb6973b" (UID: "1872a14d-aeff-46f7-8430-c6fe0eb6973b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.492697 4870 generic.go:334] "Generic (PLEG): container finished" podID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerID="43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25" exitCode=137 Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.492706 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1872a14d-aeff-46f7-8430-c6fe0eb6973b" (UID: "1872a14d-aeff-46f7-8430-c6fe0eb6973b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.492808 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74569d8966-5sjxs" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.492827 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74569d8966-5sjxs" event={"ID":"1872a14d-aeff-46f7-8430-c6fe0eb6973b","Type":"ContainerDied","Data":"43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25"} Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.496482 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74569d8966-5sjxs" event={"ID":"1872a14d-aeff-46f7-8430-c6fe0eb6973b","Type":"ContainerDied","Data":"f7250a53827f362fa55ae4df1436ef860d73a09ab3dfd65756154cbbf24973a7"} Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.496513 4870 scope.go:117] "RemoveContainer" containerID="960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.532030 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1872a14d-aeff-46f7-8430-c6fe0eb6973b-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.532287 4870 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.532296 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.532307 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/1872a14d-aeff-46f7-8430-c6fe0eb6973b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.532316 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdz8p\" (UniqueName: \"kubernetes.io/projected/1872a14d-aeff-46f7-8430-c6fe0eb6973b-kube-api-access-fdz8p\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.532325 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.543772 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1872a14d-aeff-46f7-8430-c6fe0eb6973b" (UID: "1872a14d-aeff-46f7-8430-c6fe0eb6973b"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.575053 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.592788 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.634439 4870 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1872a14d-aeff-46f7-8430-c6fe0eb6973b-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.688207 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-847c478677-wtndf"] Jan 30 08:29:30 crc kubenswrapper[4870]: E0130 08:29:30.688602 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.688619 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" Jan 30 08:29:30 crc kubenswrapper[4870]: E0130 08:29:30.688635 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon-log" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.688643 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon-log" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.688823 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon-log" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.688860 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" containerName="horizon" Jan 
30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.692838 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.697422 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.697496 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.701430 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.705626 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-847c478677-wtndf"] Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.733042 4870 scope.go:117] "RemoveContainer" containerID="43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739015 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-internal-tls-certs\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739261 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-config-data\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739287 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/c01b58ab-bb54-448b-83de-f70f08378751-etc-swift\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739340 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2xq\" (UniqueName: \"kubernetes.io/projected/c01b58ab-bb54-448b-83de-f70f08378751-kube-api-access-hc2xq\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739360 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-combined-ca-bundle\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739375 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-public-tls-certs\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739454 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c01b58ab-bb54-448b-83de-f70f08378751-log-httpd\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.739474 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c01b58ab-bb54-448b-83de-f70f08378751-run-httpd\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.807266 4870 scope.go:117] "RemoveContainer" containerID="960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3" Jan 30 08:29:30 crc kubenswrapper[4870]: E0130 08:29:30.808115 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3\": container with ID starting with 960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3 not found: ID does not exist" containerID="960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.808152 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3"} err="failed to get container status \"960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3\": rpc error: code = NotFound desc = could not find container \"960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3\": container with ID starting with 960b2516ed61848524349c99edfb46a19e044030828811dfd8fbc7164a66aca3 not found: ID does not exist" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.808178 4870 scope.go:117] "RemoveContainer" containerID="43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25" Jan 30 08:29:30 crc kubenswrapper[4870]: E0130 08:29:30.812015 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25\": container with ID 
starting with 43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25 not found: ID does not exist" containerID="43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.812060 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25"} err="failed to get container status \"43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25\": rpc error: code = NotFound desc = could not find container \"43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25\": container with ID starting with 43897c9b161d8b60a5a17a99eb44b17b490c36c5693b312f71be7235a7fc2b25 not found: ID does not exist" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.837957 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74569d8966-5sjxs"] Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.840794 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2xq\" (UniqueName: \"kubernetes.io/projected/c01b58ab-bb54-448b-83de-f70f08378751-kube-api-access-hc2xq\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.840849 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-combined-ca-bundle\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.840873 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-public-tls-certs\") pod 
\"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.841005 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c01b58ab-bb54-448b-83de-f70f08378751-log-httpd\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.841032 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c01b58ab-bb54-448b-83de-f70f08378751-run-httpd\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.841126 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-internal-tls-certs\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.841168 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-config-data\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf" Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.841192 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c01b58ab-bb54-448b-83de-f70f08378751-etc-swift\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " 
pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.841634 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c01b58ab-bb54-448b-83de-f70f08378751-run-httpd\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.841896 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c01b58ab-bb54-448b-83de-f70f08378751-log-httpd\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.846977 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-internal-tls-certs\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.848465 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-config-data\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.850392 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-public-tls-certs\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.851324 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c01b58ab-bb54-448b-83de-f70f08378751-etc-swift\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.853616 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01b58ab-bb54-448b-83de-f70f08378751-combined-ca-bundle\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.856087 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74569d8966-5sjxs"]
Jan 30 08:29:30 crc kubenswrapper[4870]: I0130 08:29:30.857793 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2xq\" (UniqueName: \"kubernetes.io/projected/c01b58ab-bb54-448b-83de-f70f08378751-kube-api-access-hc2xq\") pod \"swift-proxy-847c478677-wtndf\" (UID: \"c01b58ab-bb54-448b-83de-f70f08378751\") " pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:31 crc kubenswrapper[4870]: I0130 08:29:31.033331 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:31 crc kubenswrapper[4870]: I0130 08:29:31.762109 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-847c478677-wtndf"]
Jan 30 08:29:32 crc kubenswrapper[4870]: I0130 08:29:32.100328 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1872a14d-aeff-46f7-8430-c6fe0eb6973b" path="/var/lib/kubelet/pods/1872a14d-aeff-46f7-8430-c6fe0eb6973b/volumes"
Jan 30 08:29:32 crc kubenswrapper[4870]: I0130 08:29:32.533021 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847c478677-wtndf" event={"ID":"c01b58ab-bb54-448b-83de-f70f08378751","Type":"ContainerStarted","Data":"6e94f5f8fccf94dbe96dda526963c5ae06002d1ed02a8fdb5abbcd121fc34708"}
Jan 30 08:29:32 crc kubenswrapper[4870]: I0130 08:29:32.533327 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847c478677-wtndf" event={"ID":"c01b58ab-bb54-448b-83de-f70f08378751","Type":"ContainerStarted","Data":"d37875ada57cad08b3b77c2e55d1001af08440ce1b1712105d5ba2117cae59f6"}
Jan 30 08:29:32 crc kubenswrapper[4870]: I0130 08:29:32.533339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847c478677-wtndf" event={"ID":"c01b58ab-bb54-448b-83de-f70f08378751","Type":"ContainerStarted","Data":"34c5c1b354646482c74b5ebbf08ef3e5f83e7b9bbb0b5a32ae883cfed8df6540"}
Jan 30 08:29:32 crc kubenswrapper[4870]: I0130 08:29:32.533650 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:32 crc kubenswrapper[4870]: I0130 08:29:32.533740 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:32 crc kubenswrapper[4870]: I0130 08:29:32.558472 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-847c478677-wtndf" podStartSLOduration=2.558456815 podStartE2EDuration="2.558456815s" podCreationTimestamp="2026-01-30 08:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:32.553592753 +0000 UTC m=+1211.249139862" watchObservedRunningTime="2026-01-30 08:29:32.558456815 +0000 UTC m=+1211.254003924"
Jan 30 08:29:33 crc kubenswrapper[4870]: I0130 08:29:33.544366 4870 generic.go:334] "Generic (PLEG): container finished" podID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerID="bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba" exitCode=1
Jan 30 08:29:33 crc kubenswrapper[4870]: I0130 08:29:33.544475 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerDied","Data":"bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba"}
Jan 30 08:29:33 crc kubenswrapper[4870]: I0130 08:29:33.544871 4870 scope.go:117] "RemoveContainer" containerID="1368b589787d7b14188bdd5cbf6d5c41177fec471c90618303e481623b136b42"
Jan 30 08:29:33 crc kubenswrapper[4870]: I0130 08:29:33.546123 4870 scope.go:117] "RemoveContainer" containerID="bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba"
Jan 30 08:29:33 crc kubenswrapper[4870]: E0130 08:29:33.546390 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8628af25-d5e4-46a0-adec-4c25ca39676b)\"" pod="openstack/watcher-decision-engine-0" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b"
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.096672 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.117455 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-central-agent" containerID="cri-o://a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36" gracePeriod=30
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.117681 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="sg-core" containerID="cri-o://d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08" gracePeriod=30
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.117671 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="proxy-httpd" containerID="cri-o://655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7" gracePeriod=30
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.117716 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-notification-agent" containerID="cri-o://532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac" gracePeriod=30
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.577706 4870 generic.go:334] "Generic (PLEG): container finished" podID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerID="655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7" exitCode=0
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.577741 4870 generic.go:334] "Generic (PLEG): container finished" podID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerID="d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08" exitCode=2
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.577767 4870 generic.go:334] "Generic (PLEG): container finished" podID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerID="a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36" exitCode=0
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.577819 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerDied","Data":"655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7"}
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.577850 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerDied","Data":"d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08"}
Jan 30 08:29:34 crc kubenswrapper[4870]: I0130 08:29:34.577863 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerDied","Data":"a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36"}
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.052267 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165060 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-sg-core-conf-yaml\") pod \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") "
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165123 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-log-httpd\") pod \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") "
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165206 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-run-httpd\") pod \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") "
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165238 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-combined-ca-bundle\") pod \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") "
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165279 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-config-data\") pod \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") "
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165312 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-scripts\") pod \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") "
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165339 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6jkw\" (UniqueName: \"kubernetes.io/projected/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-kube-api-access-q6jkw\") pod \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\" (UID: \"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f\") "
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.165910 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" (UID: "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.167738 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" (UID: "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.172310 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-kube-api-access-q6jkw" (OuterVolumeSpecName: "kube-api-access-q6jkw") pod "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" (UID: "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f"). InnerVolumeSpecName "kube-api-access-q6jkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.172313 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-scripts" (OuterVolumeSpecName: "scripts") pod "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" (UID: "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.219626 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" (UID: "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.269073 4870 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.269217 4870 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.269228 4870 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.269236 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.269245 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6jkw\" (UniqueName: \"kubernetes.io/projected/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-kube-api-access-q6jkw\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.309978 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-config-data" (OuterVolumeSpecName: "config-data") pod "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" (UID: "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.334013 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" (UID: "bebd196d-f8e4-466e-aa1f-99a65e3c7c6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.372317 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.372347 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.595065 4870 generic.go:334] "Generic (PLEG): container finished" podID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerID="532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac" exitCode=0
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.595105 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerDied","Data":"532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac"}
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.595131 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bebd196d-f8e4-466e-aa1f-99a65e3c7c6f","Type":"ContainerDied","Data":"ea0f16d0885c4f830bf99a5da540d693451546a651ab72659db0f8c0dde59721"}
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.595148 4870 scope.go:117] "RemoveContainer" containerID="655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.595177 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.623002 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.623057 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.637397 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.644223 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.644260 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.670943 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682034 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:29:35 crc kubenswrapper[4870]: E0130 08:29:35.682500 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="proxy-httpd"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682514 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="proxy-httpd"
Jan 30 08:29:35 crc kubenswrapper[4870]: E0130 08:29:35.682530 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="sg-core"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682538 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="sg-core"
Jan 30 08:29:35 crc kubenswrapper[4870]: E0130 08:29:35.682565 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-notification-agent"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682571 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-notification-agent"
Jan 30 08:29:35 crc kubenswrapper[4870]: E0130 08:29:35.682580 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-central-agent"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682586 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-central-agent"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682763 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-notification-agent"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682775 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="sg-core"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682788 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="ceilometer-central-agent"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.682808 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" containerName="proxy-httpd"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.684507 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.684609 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.687087 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.687561 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.697028 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.712829 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.739073 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.755503 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.781228 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.783370 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-config-data\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.783580 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.783732 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-scripts\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.783863 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6ng\" (UniqueName: \"kubernetes.io/projected/e7b7679e-30f5-4f8a-96e0-a1581691242d-kube-api-access-kk6ng\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.783907 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.783990 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.834728 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.885399 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-scripts\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.885478 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6ng\" (UniqueName: \"kubernetes.io/projected/e7b7679e-30f5-4f8a-96e0-a1581691242d-kube-api-access-kk6ng\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.885503 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.885550 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.885581 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.885617 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-config-data\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.885679 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.888259 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.888561 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.890663 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.892005 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-config-data\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.892285 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-scripts\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.902339 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:35 crc kubenswrapper[4870]: I0130 08:29:35.902944 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6ng\" (UniqueName: \"kubernetes.io/projected/e7b7679e-30f5-4f8a-96e0-a1581691242d-kube-api-access-kk6ng\") pod \"ceilometer-0\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " pod="openstack/ceilometer-0"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.040109 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.090051 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bebd196d-f8e4-466e-aa1f-99a65e3c7c6f" path="/var/lib/kubelet/pods/bebd196d-f8e4-466e-aa1f-99a65e3c7c6f/volumes"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.516514 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5564cc7ccb-wnwrs"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.606360 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.606407 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.606530 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.607292 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:36 crc kubenswrapper[4870]: I0130 08:29:36.766011 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:29:37 crc kubenswrapper[4870]: I0130 08:29:37.229848 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5564cc7ccb-wnwrs"
Jan 30 08:29:37 crc kubenswrapper[4870]: I0130 08:29:37.329283 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-ddfdbf76d-mfqhx"]
Jan 30 08:29:37 crc kubenswrapper[4870]: I0130 08:29:37.329700 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-ddfdbf76d-mfqhx" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api-log" containerID="cri-o://4bfee3b8db156e8f632e1b810fed5fff1f01f89361fe371e88524396b0b3e740" gracePeriod=30
Jan 30 08:29:37 crc kubenswrapper[4870]: I0130 08:29:37.330154 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-ddfdbf76d-mfqhx" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api" containerID="cri-o://9a536688d050dc6432091788f5363412218a7bb425a3e180118973e359516afe" gracePeriod=30
Jan 30 08:29:37 crc kubenswrapper[4870]: I0130 08:29:37.630397 4870 generic.go:334] "Generic (PLEG): container finished" podID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerID="4bfee3b8db156e8f632e1b810fed5fff1f01f89361fe371e88524396b0b3e740" exitCode=143
Jan 30 08:29:37 crc kubenswrapper[4870]: I0130 08:29:37.631244 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-ddfdbf76d-mfqhx" event={"ID":"57a4731e-3232-4d27-acf8-9d34ee7570a7","Type":"ContainerDied","Data":"4bfee3b8db156e8f632e1b810fed5fff1f01f89361fe371e88524396b0b3e740"}
Jan 30 08:29:38 crc kubenswrapper[4870]: I0130 08:29:38.217896 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Jan 30 08:29:38 crc kubenswrapper[4870]: I0130 08:29:38.217938 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Jan 30 08:29:38 crc kubenswrapper[4870]: I0130 08:29:38.218587 4870 scope.go:117] "RemoveContainer" containerID="bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba"
Jan 30 08:29:38 crc kubenswrapper[4870]: E0130 08:29:38.218814 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8628af25-d5e4-46a0-adec-4c25ca39676b)\"" pod="openstack/watcher-decision-engine-0" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b"
Jan 30 08:29:38 crc kubenswrapper[4870]: I0130 08:29:38.640377 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 08:29:38 crc kubenswrapper[4870]: I0130 08:29:38.640709 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 08:29:39 crc kubenswrapper[4870]: I0130 08:29:39.259404 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-ddfdbf76d-mfqhx" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": dial tcp 10.217.0.175:9311: connect: connection refused"
Jan 30 08:29:39 crc kubenswrapper[4870]: I0130 08:29:39.259411 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-ddfdbf76d-mfqhx" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": dial tcp 10.217.0.175:9311: connect: connection refused"
Jan 30 08:29:39 crc kubenswrapper[4870]: I0130 08:29:39.653163 4870 generic.go:334] "Generic (PLEG): container finished" podID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerID="9a536688d050dc6432091788f5363412218a7bb425a3e180118973e359516afe" exitCode=0
Jan 30 08:29:39 crc kubenswrapper[4870]: I0130 08:29:39.653216 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-ddfdbf76d-mfqhx" event={"ID":"57a4731e-3232-4d27-acf8-9d34ee7570a7","Type":"ContainerDied","Data":"9a536688d050dc6432091788f5363412218a7bb425a3e180118973e359516afe"}
Jan 30 08:29:40 crc kubenswrapper[4870]: I0130 08:29:40.385604 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:40 crc kubenswrapper[4870]: I0130 08:29:40.386094 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 08:29:40 crc kubenswrapper[4870]: I0130 08:29:40.386647 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:40 crc kubenswrapper[4870]: I0130 08:29:40.386735 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 08:29:40 crc kubenswrapper[4870]: I0130 08:29:40.542857 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:40 crc kubenswrapper[4870]: I0130 08:29:40.608643 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:40 crc kubenswrapper[4870]: I0130 08:29:40.948721 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f966fd88d-sdpcn"
Jan 30 08:29:41 crc kubenswrapper[4870]: I0130 08:29:41.044927 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:41 crc kubenswrapper[4870]: I0130 08:29:41.047425 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-847c478677-wtndf"
Jan 30 08:29:42 crc kubenswrapper[4870]: I0130 08:29:42.282376 4870 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podedd09a42-14b6-4161-ba2a-82c4cf4f5983"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podedd09a42-14b6-4161-ba2a-82c4cf4f5983] : Timed out while waiting for systemd to remove kubepods-besteffort-podedd09a42_14b6_4161_ba2a_82c4cf4f5983.slice"
Jan 30 08:29:44 crc kubenswrapper[4870]: I0130 08:29:44.259551 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-ddfdbf76d-mfqhx" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": dial tcp 10.217.0.175:9311:
connect: connection refused" Jan 30 08:29:44 crc kubenswrapper[4870]: I0130 08:29:44.259558 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-ddfdbf76d-mfqhx" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": dial tcp 10.217.0.175:9311: connect: connection refused" Jan 30 08:29:44 crc kubenswrapper[4870]: I0130 08:29:44.751803 4870 generic.go:334] "Generic (PLEG): container finished" podID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerID="15d4837d5c345debdddee40f4775635705eb028fe7633d8e6d5c855f92746c7a" exitCode=137 Jan 30 08:29:44 crc kubenswrapper[4870]: I0130 08:29:44.751842 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c1333f8-2564-4b5c-84b9-0045d742c45f","Type":"ContainerDied","Data":"15d4837d5c345debdddee40f4775635705eb028fe7633d8e6d5c855f92746c7a"} Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.471406 4870 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-openstackclient:watcher_latest" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.471677 4870 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.23:5001/podified-master-centos10/openstack-openstackclient:watcher_latest" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.471824 4870 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:38.102.83.23:5001/podified-master-centos10/openstack-openstackclient:watcher_latest,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch687h599h88h5bdh5f9h54dh584h59fh649hb7h78h565h5f9hd5h664hdch69h65fh65h665h5d5h56h579hd9h679h54bh675hb7h5c7h6fh54q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_CA_CERT,Value:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dvzdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(204a0d39-f7b0-4468-a82f-9fcc49fc1281): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.473282 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="204a0d39-f7b0-4468-a82f-9fcc49fc1281" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.493055 4870 scope.go:117] "RemoveContainer" containerID="d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.614712 4870 scope.go:117] "RemoveContainer" containerID="532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.654692 4870 scope.go:117] "RemoveContainer" containerID="a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.681810 4870 scope.go:117] "RemoveContainer" containerID="655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.682245 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7\": container with ID starting with 
655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7 not found: ID does not exist" containerID="655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.682272 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7"} err="failed to get container status \"655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7\": rpc error: code = NotFound desc = could not find container \"655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7\": container with ID starting with 655c44a300fccee93f2efd7de792fed09bcb0d74830f54032964e7370e7ef3a7 not found: ID does not exist" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.682291 4870 scope.go:117] "RemoveContainer" containerID="d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.682626 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08\": container with ID starting with d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08 not found: ID does not exist" containerID="d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.682646 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08"} err="failed to get container status \"d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08\": rpc error: code = NotFound desc = could not find container \"d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08\": container with ID starting with d34d7c6bccea77a359da1387457bb8ca89ae48119f56d025bbc10a0703b7de08 not found: ID does not 
exist" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.682662 4870 scope.go:117] "RemoveContainer" containerID="532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.683291 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac\": container with ID starting with 532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac not found: ID does not exist" containerID="532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.683317 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac"} err="failed to get container status \"532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac\": rpc error: code = NotFound desc = could not find container \"532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac\": container with ID starting with 532ae0bccf3ad7a9098b323c71e99784c6b9da57d38020ef2075d91ed4925bac not found: ID does not exist" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.683334 4870 scope.go:117] "RemoveContainer" containerID="a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.683651 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36\": container with ID starting with a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36 not found: ID does not exist" containerID="a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36" Jan 30 08:29:45 crc kubenswrapper[4870]: I0130 08:29:45.683680 4870 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36"} err="failed to get container status \"a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36\": rpc error: code = NotFound desc = could not find container \"a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36\": container with ID starting with a3e37bcca1f4eba205fa794e1aff9acbffd8742aa8f58f2fa6ca0b268fdeff36 not found: ID does not exist" Jan 30 08:29:45 crc kubenswrapper[4870]: E0130 08:29:45.765534 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.23:5001/podified-master-centos10/openstack-openstackclient:watcher_latest\\\"\"" pod="openstack/openstackclient" podUID="204a0d39-f7b0-4468-a82f-9fcc49fc1281" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:45.999951 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.007778 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.097465 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data-custom\") pod \"2c1333f8-2564-4b5c-84b9-0045d742c45f\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.097998 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-scripts\") pod \"2c1333f8-2564-4b5c-84b9-0045d742c45f\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098081 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-combined-ca-bundle\") pod \"57a4731e-3232-4d27-acf8-9d34ee7570a7\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098117 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data\") pod \"57a4731e-3232-4d27-acf8-9d34ee7570a7\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098148 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qdrz\" (UniqueName: \"kubernetes.io/projected/57a4731e-3232-4d27-acf8-9d34ee7570a7-kube-api-access-2qdrz\") pod \"57a4731e-3232-4d27-acf8-9d34ee7570a7\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098212 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/57a4731e-3232-4d27-acf8-9d34ee7570a7-logs\") pod \"57a4731e-3232-4d27-acf8-9d34ee7570a7\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098231 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c1333f8-2564-4b5c-84b9-0045d742c45f-logs\") pod \"2c1333f8-2564-4b5c-84b9-0045d742c45f\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098303 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data-custom\") pod \"57a4731e-3232-4d27-acf8-9d34ee7570a7\" (UID: \"57a4731e-3232-4d27-acf8-9d34ee7570a7\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098323 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1333f8-2564-4b5c-84b9-0045d742c45f-etc-machine-id\") pod \"2c1333f8-2564-4b5c-84b9-0045d742c45f\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098343 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhtln\" (UniqueName: \"kubernetes.io/projected/2c1333f8-2564-4b5c-84b9-0045d742c45f-kube-api-access-xhtln\") pod \"2c1333f8-2564-4b5c-84b9-0045d742c45f\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098384 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data\") pod \"2c1333f8-2564-4b5c-84b9-0045d742c45f\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.098406 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-combined-ca-bundle\") pod \"2c1333f8-2564-4b5c-84b9-0045d742c45f\" (UID: \"2c1333f8-2564-4b5c-84b9-0045d742c45f\") " Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.100067 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a4731e-3232-4d27-acf8-9d34ee7570a7-logs" (OuterVolumeSpecName: "logs") pod "57a4731e-3232-4d27-acf8-9d34ee7570a7" (UID: "57a4731e-3232-4d27-acf8-9d34ee7570a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.100696 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c1333f8-2564-4b5c-84b9-0045d742c45f-logs" (OuterVolumeSpecName: "logs") pod "2c1333f8-2564-4b5c-84b9-0045d742c45f" (UID: "2c1333f8-2564-4b5c-84b9-0045d742c45f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.101345 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c1333f8-2564-4b5c-84b9-0045d742c45f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2c1333f8-2564-4b5c-84b9-0045d742c45f" (UID: "2c1333f8-2564-4b5c-84b9-0045d742c45f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.106655 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a4731e-3232-4d27-acf8-9d34ee7570a7-kube-api-access-2qdrz" (OuterVolumeSpecName: "kube-api-access-2qdrz") pod "57a4731e-3232-4d27-acf8-9d34ee7570a7" (UID: "57a4731e-3232-4d27-acf8-9d34ee7570a7"). InnerVolumeSpecName "kube-api-access-2qdrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.108148 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2c1333f8-2564-4b5c-84b9-0045d742c45f" (UID: "2c1333f8-2564-4b5c-84b9-0045d742c45f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.108546 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "57a4731e-3232-4d27-acf8-9d34ee7570a7" (UID: "57a4731e-3232-4d27-acf8-9d34ee7570a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: W0130 08:29:46.111830 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b7679e_30f5_4f8a_96e0_a1581691242d.slice/crio-97c0147556c32e49c5c20b79bd0b3e0fe461928cc98a023b7da24b4aa033c569 WatchSource:0}: Error finding container 97c0147556c32e49c5c20b79bd0b3e0fe461928cc98a023b7da24b4aa033c569: Status 404 returned error can't find the container with id 97c0147556c32e49c5c20b79bd0b3e0fe461928cc98a023b7da24b4aa033c569 Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.113222 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1333f8-2564-4b5c-84b9-0045d742c45f-kube-api-access-xhtln" (OuterVolumeSpecName: "kube-api-access-xhtln") pod "2c1333f8-2564-4b5c-84b9-0045d742c45f" (UID: "2c1333f8-2564-4b5c-84b9-0045d742c45f"). InnerVolumeSpecName "kube-api-access-xhtln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.119184 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.122261 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.135575 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c1333f8-2564-4b5c-84b9-0045d742c45f" (UID: "2c1333f8-2564-4b5c-84b9-0045d742c45f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.145965 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-scripts" (OuterVolumeSpecName: "scripts") pod "2c1333f8-2564-4b5c-84b9-0045d742c45f" (UID: "2c1333f8-2564-4b5c-84b9-0045d742c45f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.175881 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57a4731e-3232-4d27-acf8-9d34ee7570a7" (UID: "57a4731e-3232-4d27-acf8-9d34ee7570a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.177968 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data" (OuterVolumeSpecName: "config-data") pod "2c1333f8-2564-4b5c-84b9-0045d742c45f" (UID: "2c1333f8-2564-4b5c-84b9-0045d742c45f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.185266 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data" (OuterVolumeSpecName: "config-data") pod "57a4731e-3232-4d27-acf8-9d34ee7570a7" (UID: "57a4731e-3232-4d27-acf8-9d34ee7570a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200359 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qdrz\" (UniqueName: \"kubernetes.io/projected/57a4731e-3232-4d27-acf8-9d34ee7570a7-kube-api-access-2qdrz\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200388 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a4731e-3232-4d27-acf8-9d34ee7570a7-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200398 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c1333f8-2564-4b5c-84b9-0045d742c45f-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200407 4870 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: 
I0130 08:29:46.200416 4870 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1333f8-2564-4b5c-84b9-0045d742c45f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200425 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhtln\" (UniqueName: \"kubernetes.io/projected/2c1333f8-2564-4b5c-84b9-0045d742c45f-kube-api-access-xhtln\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200433 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200441 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200449 4870 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200457 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1333f8-2564-4b5c-84b9-0045d742c45f-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200466 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.200478 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/57a4731e-3232-4d27-acf8-9d34ee7570a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.786993 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerStarted","Data":"d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4"} Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.787383 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerStarted","Data":"50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3"} Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.787402 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerStarted","Data":"97c0147556c32e49c5c20b79bd0b3e0fe461928cc98a023b7da24b4aa033c569"} Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.789416 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c1333f8-2564-4b5c-84b9-0045d742c45f","Type":"ContainerDied","Data":"5eca3df12794c5b43fdb77c898c9bd28c39f3103bd50eb3571fc088c025d0cf9"} Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.789454 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.789470 4870 scope.go:117] "RemoveContainer" containerID="15d4837d5c345debdddee40f4775635705eb028fe7633d8e6d5c855f92746c7a" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.793736 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-ddfdbf76d-mfqhx" event={"ID":"57a4731e-3232-4d27-acf8-9d34ee7570a7","Type":"ContainerDied","Data":"4fefb5421067779e4ffb7448501feacbbd8e1262345c29ebcf35ade1e4bf9f85"} Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.793791 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-ddfdbf76d-mfqhx" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.820054 4870 scope.go:117] "RemoveContainer" containerID="135b22419515f0c37fa07d5cae62ea43515dc6460498408338173f7df4e2361b" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.839938 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-ddfdbf76d-mfqhx"] Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.852949 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-ddfdbf76d-mfqhx"] Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.855647 4870 scope.go:117] "RemoveContainer" containerID="9a536688d050dc6432091788f5363412218a7bb425a3e180118973e359516afe" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.862386 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.872250 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883089 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:46 crc kubenswrapper[4870]: E0130 08:29:46.883536 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api-log" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883556 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api-log" Jan 30 08:29:46 crc kubenswrapper[4870]: E0130 08:29:46.883572 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883579 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api" Jan 30 08:29:46 crc kubenswrapper[4870]: E0130 08:29:46.883601 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api-log" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883608 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api-log" Jan 30 08:29:46 crc kubenswrapper[4870]: E0130 08:29:46.883627 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883634 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883826 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api-log" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883847 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api-log" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883858 4870 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" containerName="barbican-api" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.883868 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" containerName="cinder-api" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.885146 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.889202 4870 scope.go:117] "RemoveContainer" containerID="4bfee3b8db156e8f632e1b810fed5fff1f01f89361fe371e88524396b0b3e740" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.889419 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.889453 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.889546 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 08:29:46 crc kubenswrapper[4870]: I0130 08:29:46.896575 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.024775 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.024832 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-logs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " 
pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.024957 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-scripts\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.024973 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.025006 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.025031 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.025095 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6hx\" (UniqueName: \"kubernetes.io/projected/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-kube-api-access-2k6hx\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.025154 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-config-data\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.025177 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127252 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-config-data\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127301 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127357 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127376 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-logs\") pod 
\"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127424 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-scripts\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127437 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127471 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127497 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.127550 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6hx\" (UniqueName: \"kubernetes.io/projected/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-kube-api-access-2k6hx\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.128214 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.128516 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-logs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.132751 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-config-data-custom\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.133677 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.133724 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-config-data\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.135441 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 
crc kubenswrapper[4870]: I0130 08:29:47.138488 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-scripts\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.142388 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.147263 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6hx\" (UniqueName: \"kubernetes.io/projected/dcb916a9-c812-4e35-91d2-a4cc4ef78fc3-kube-api-access-2k6hx\") pod \"cinder-api-0\" (UID: \"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3\") " pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.235753 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.700386 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.808373 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d69bf9957-gj6dt" Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.811318 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3","Type":"ContainerStarted","Data":"8c3a38440e961862f41286f49f7349a9380fb90dba8ff37905275bcfa07ea8ce"} Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.817797 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerStarted","Data":"fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8"} Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.872172 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f966fd88d-sdpcn"] Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.872441 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f966fd88d-sdpcn" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-api" containerID="cri-o://b1d3aae9bf64c7d5adb6c7a0c0cef4cbd05ceee79bf2bdc26b1676e0ef8ac7ff" gracePeriod=30 Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.872826 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f966fd88d-sdpcn" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-httpd" containerID="cri-o://8e106b0c6b2ed513250f13c043895b69dbe1cd77d36b5ecd4e47e2f2226b112e" gracePeriod=30 Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.987947 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.988268 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-log" containerID="cri-o://b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe" gracePeriod=30 Jan 30 08:29:47 crc kubenswrapper[4870]: I0130 08:29:47.988842 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-httpd" containerID="cri-o://5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5" gracePeriod=30 Jan 30 08:29:48 crc kubenswrapper[4870]: I0130 08:29:48.105068 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c1333f8-2564-4b5c-84b9-0045d742c45f" path="/var/lib/kubelet/pods/2c1333f8-2564-4b5c-84b9-0045d742c45f/volumes" Jan 30 08:29:48 crc kubenswrapper[4870]: I0130 08:29:48.107110 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a4731e-3232-4d27-acf8-9d34ee7570a7" path="/var/lib/kubelet/pods/57a4731e-3232-4d27-acf8-9d34ee7570a7/volumes" Jan 30 08:29:48 crc kubenswrapper[4870]: I0130 08:29:48.843548 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3","Type":"ContainerStarted","Data":"82245963163f3797bd8db1de7522dea3a555555b0f7844f58138397bc618217c"} Jan 30 08:29:48 crc kubenswrapper[4870]: I0130 08:29:48.859734 4870 generic.go:334] "Generic (PLEG): container finished" podID="01e7af93-8480-4484-9558-5455eb00fa2b" containerID="b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe" exitCode=143 Jan 30 08:29:48 crc kubenswrapper[4870]: I0130 08:29:48.860057 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"01e7af93-8480-4484-9558-5455eb00fa2b","Type":"ContainerDied","Data":"b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe"} Jan 30 08:29:48 crc kubenswrapper[4870]: I0130 08:29:48.865752 4870 generic.go:334] "Generic (PLEG): container finished" podID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerID="8e106b0c6b2ed513250f13c043895b69dbe1cd77d36b5ecd4e47e2f2226b112e" exitCode=0 Jan 30 08:29:48 crc kubenswrapper[4870]: I0130 08:29:48.865797 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f966fd88d-sdpcn" event={"ID":"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88","Type":"ContainerDied","Data":"8e106b0c6b2ed513250f13c043895b69dbe1cd77d36b5ecd4e47e2f2226b112e"} Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.366254 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482559 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-logs\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482614 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-combined-ca-bundle\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482640 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482697 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-httpd-run\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482771 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp296\" (UniqueName: \"kubernetes.io/projected/01e7af93-8480-4484-9558-5455eb00fa2b-kube-api-access-bp296\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482837 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-public-tls-certs\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482866 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-config-data\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.482942 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-scripts\") pod \"01e7af93-8480-4484-9558-5455eb00fa2b\" (UID: \"01e7af93-8480-4484-9558-5455eb00fa2b\") " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.483226 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-logs" (OuterVolumeSpecName: "logs") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: 
"01e7af93-8480-4484-9558-5455eb00fa2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.483548 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.483593 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: "01e7af93-8480-4484-9558-5455eb00fa2b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.488137 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: "01e7af93-8480-4484-9558-5455eb00fa2b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.488685 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-scripts" (OuterVolumeSpecName: "scripts") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: "01e7af93-8480-4484-9558-5455eb00fa2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.503057 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e7af93-8480-4484-9558-5455eb00fa2b-kube-api-access-bp296" (OuterVolumeSpecName: "kube-api-access-bp296") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: "01e7af93-8480-4484-9558-5455eb00fa2b"). 
InnerVolumeSpecName "kube-api-access-bp296". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.536390 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: "01e7af93-8480-4484-9558-5455eb00fa2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.538495 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-config-data" (OuterVolumeSpecName: "config-data") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: "01e7af93-8480-4484-9558-5455eb00fa2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.560077 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "01e7af93-8480-4484-9558-5455eb00fa2b" (UID: "01e7af93-8480-4484-9558-5455eb00fa2b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.585044 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp296\" (UniqueName: \"kubernetes.io/projected/01e7af93-8480-4484-9558-5455eb00fa2b-kube-api-access-bp296\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.585076 4870 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.585086 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.585094 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.585102 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e7af93-8480-4484-9558-5455eb00fa2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.585134 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.585143 4870 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01e7af93-8480-4484-9558-5455eb00fa2b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.603414 4870 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.686731 4870 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.784942 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.785187 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-log" containerID="cri-o://2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1" gracePeriod=30 Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.785291 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-httpd" containerID="cri-o://7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18" gracePeriod=30 Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.883692 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerStarted","Data":"ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249"} Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.883793 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-central-agent" containerID="cri-o://50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3" gracePeriod=30 Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.883830 4870 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.883831 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="sg-core" containerID="cri-o://fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8" gracePeriod=30 Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.883865 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-notification-agent" containerID="cri-o://d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4" gracePeriod=30 Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.883891 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="proxy-httpd" containerID="cri-o://ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249" gracePeriod=30 Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.897251 4870 generic.go:334] "Generic (PLEG): container finished" podID="01e7af93-8480-4484-9558-5455eb00fa2b" containerID="5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5" exitCode=0 Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.897312 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01e7af93-8480-4484-9558-5455eb00fa2b","Type":"ContainerDied","Data":"5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5"} Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.897337 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"01e7af93-8480-4484-9558-5455eb00fa2b","Type":"ContainerDied","Data":"eda82349da897444026066edb7a8f71a1933756e2aef786c074692bf323e90ef"}
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.897354 4870 scope.go:117] "RemoveContainer" containerID="5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5"
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.897448 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.926937 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dcb916a9-c812-4e35-91d2-a4cc4ef78fc3","Type":"ContainerStarted","Data":"51db07cabbb6030fef39307c721258594aeda04685efcd0196b6f2f9126031a1"}
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.927142 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.948454 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=12.008972996 podStartE2EDuration="14.948437651s" podCreationTimestamp="2026-01-30 08:29:35 +0000 UTC" firstStartedPulling="2026-01-30 08:29:46.122024559 +0000 UTC m=+1224.817571668" lastFinishedPulling="2026-01-30 08:29:49.061489214 +0000 UTC m=+1227.757036323" observedRunningTime="2026-01-30 08:29:49.911737537 +0000 UTC m=+1228.607284646" watchObservedRunningTime="2026-01-30 08:29:49.948437651 +0000 UTC m=+1228.643984760"
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.949450 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9494441829999998 podStartE2EDuration="3.949444183s" podCreationTimestamp="2026-01-30 08:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:49.946039586 +0000 UTC m=+1228.641586695" watchObservedRunningTime="2026-01-30 08:29:49.949444183 +0000 UTC m=+1228.644991292"
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.972445 4870 scope.go:117] "RemoveContainer" containerID="b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe"
Jan 30 08:29:49 crc kubenswrapper[4870]: I0130 08:29:49.979983 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.010965 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.025090 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:50 crc kubenswrapper[4870]: E0130 08:29:50.026162 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-httpd"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.026184 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-httpd"
Jan 30 08:29:50 crc kubenswrapper[4870]: E0130 08:29:50.026216 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-log"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.026224 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-log"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.027609 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-log"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.037964 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" containerName="glance-httpd"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.040782 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.044528 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.057528 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.057758 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.058236 4870 scope.go:117] "RemoveContainer" containerID="5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5"
Jan 30 08:29:50 crc kubenswrapper[4870]: E0130 08:29:50.068340 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5\": container with ID starting with 5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5 not found: ID does not exist" containerID="5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.068438 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5"} err="failed to get container status \"5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5\": rpc error: code = NotFound desc = could not find container \"5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5\": container with ID starting with 5de813254396c24f18fe9fd8238b591bddeb0213951b91ac54dfc76965423fe5 not found: ID does not exist"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.068495 4870 scope.go:117] "RemoveContainer" containerID="b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe"
Jan 30 08:29:50 crc kubenswrapper[4870]: E0130 08:29:50.075551 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe\": container with ID starting with b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe not found: ID does not exist" containerID="b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.075594 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe"} err="failed to get container status \"b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe\": rpc error: code = NotFound desc = could not find container \"b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe\": container with ID starting with b7338a02d0d2eab98544b50920230e4bcceedd83d4acefe6a39ec62e93edaabe not found: ID does not exist"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.119466 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e7af93-8480-4484-9558-5455eb00fa2b" path="/var/lib/kubelet/pods/01e7af93-8480-4484-9558-5455eb00fa2b/volumes"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127084 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-config-data\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127161 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-scripts\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127201 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmh9s\" (UniqueName: \"kubernetes.io/projected/743b8276-eb2e-49fa-b493-fb83f20837ed-kube-api-access-fmh9s\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127267 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/743b8276-eb2e-49fa-b493-fb83f20837ed-logs\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127296 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127379 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127521 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/743b8276-eb2e-49fa-b493-fb83f20837ed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.127551 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.228811 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.228931 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/743b8276-eb2e-49fa-b493-fb83f20837ed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.228954 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.229015 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-config-data\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.229046 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-scripts\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.229082 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmh9s\" (UniqueName: \"kubernetes.io/projected/743b8276-eb2e-49fa-b493-fb83f20837ed-kube-api-access-fmh9s\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.229125 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/743b8276-eb2e-49fa-b493-fb83f20837ed-logs\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.229150 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.230228 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/743b8276-eb2e-49fa-b493-fb83f20837ed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.230418 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.230674 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/743b8276-eb2e-49fa-b493-fb83f20837ed-logs\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.234266 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-scripts\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.236466 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.236696 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-config-data\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.237444 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/743b8276-eb2e-49fa-b493-fb83f20837ed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.247570 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmh9s\" (UniqueName: \"kubernetes.io/projected/743b8276-eb2e-49fa-b493-fb83f20837ed-kube-api-access-fmh9s\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.264356 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"743b8276-eb2e-49fa-b493-fb83f20837ed\") " pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.438776 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.938321 4870 generic.go:334] "Generic (PLEG): container finished" podID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerID="b1d3aae9bf64c7d5adb6c7a0c0cef4cbd05ceee79bf2bdc26b1676e0ef8ac7ff" exitCode=0
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.938492 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f966fd88d-sdpcn" event={"ID":"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88","Type":"ContainerDied","Data":"b1d3aae9bf64c7d5adb6c7a0c0cef4cbd05ceee79bf2bdc26b1676e0ef8ac7ff"}
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.938730 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f966fd88d-sdpcn" event={"ID":"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88","Type":"ContainerDied","Data":"cfedb81ba7d9e195fe41ff8a768c117183039fb6da240c26ec467012add1460c"}
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.938742 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfedb81ba7d9e195fe41ff8a768c117183039fb6da240c26ec467012add1460c"
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.940637 4870 generic.go:334] "Generic (PLEG): container finished" podID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerID="2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1" exitCode=143
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.940713 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65bb64e7-45f2-4b8d-94f0-34c21ac75042","Type":"ContainerDied","Data":"2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1"}
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.943417 4870 generic.go:334] "Generic (PLEG): container finished" podID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerID="ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249" exitCode=0
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.943438 4870 generic.go:334] "Generic (PLEG): container finished" podID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerID="fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8" exitCode=2
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.943446 4870 generic.go:334] "Generic (PLEG): container finished" podID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerID="d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4" exitCode=0
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.943440 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerDied","Data":"ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249"}
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.943478 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerDied","Data":"fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8"}
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.943495 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerDied","Data":"d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4"}
Jan 30 08:29:50 crc kubenswrapper[4870]: I0130 08:29:50.962252 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f966fd88d-sdpcn"
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.046586 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-httpd-config\") pod \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") "
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.046651 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-config\") pod \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") "
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.046794 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-ovndb-tls-certs\") pod \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") "
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.046834 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8xs7\" (UniqueName: \"kubernetes.io/projected/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-kube-api-access-z8xs7\") pod \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") "
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.046895 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-combined-ca-bundle\") pod \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\" (UID: \"cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88\") "
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.058006 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" (UID: "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.074459 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-kube-api-access-z8xs7" (OuterVolumeSpecName: "kube-api-access-z8xs7") pod "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" (UID: "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88"). InnerVolumeSpecName "kube-api-access-z8xs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.074970 4870 scope.go:117] "RemoveContainer" containerID="bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba"
Jan 30 08:29:51 crc kubenswrapper[4870]: E0130 08:29:51.075305 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8628af25-d5e4-46a0-adec-4c25ca39676b)\"" pod="openstack/watcher-decision-engine-0" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b"
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.102426 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-config" (OuterVolumeSpecName: "config") pod "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" (UID: "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.126549 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" (UID: "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.131313 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" (UID: "cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.149136 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8xs7\" (UniqueName: \"kubernetes.io/projected/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-kube-api-access-z8xs7\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.149166 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.149175 4870 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.149184 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.149192 4870 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.169213 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.957924 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.959505 4870 generic.go:334] "Generic (PLEG): container finished" podID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerID="7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18" exitCode=0
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.959567 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65bb64e7-45f2-4b8d-94f0-34c21ac75042","Type":"ContainerDied","Data":"7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18"}
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.959594 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65bb64e7-45f2-4b8d-94f0-34c21ac75042","Type":"ContainerDied","Data":"efd4740849ad794d4e57051943e44782204bcc76846a706c3d143892f1a69cb3"}
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.959612 4870 scope.go:117] "RemoveContainer" containerID="7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18"
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.961637 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f966fd88d-sdpcn"
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.962950 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"743b8276-eb2e-49fa-b493-fb83f20837ed","Type":"ContainerStarted","Data":"7045cb5ae41f1b9872cd811da064658bad7a81f498adc2ec631312931ad8e707"}
Jan 30 08:29:51 crc kubenswrapper[4870]: I0130 08:29:51.962993 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"743b8276-eb2e-49fa-b493-fb83f20837ed","Type":"ContainerStarted","Data":"8841e1e2b84975d36b7273d680e634b70dead257ee514476a271e3ed38497b8b"}
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.016686 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f966fd88d-sdpcn"]
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.017990 4870 scope.go:117] "RemoveContainer" containerID="2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1"
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.022731 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f966fd88d-sdpcn"]
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069440 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-scripts\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069571 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-config-data\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069611 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-httpd-run\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069680 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069748 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-logs\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069805 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-combined-ca-bundle\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069841 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-internal-tls-certs\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.069890 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46d96\" (UniqueName: \"kubernetes.io/projected/65bb64e7-45f2-4b8d-94f0-34c21ac75042-kube-api-access-46d96\") pod \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\" (UID: \"65bb64e7-45f2-4b8d-94f0-34c21ac75042\") "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.071420 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-logs" (OuterVolumeSpecName: "logs") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.071694 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.075484 4870 scope.go:117] "RemoveContainer" containerID="7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18"
Jan 30 08:29:52 crc kubenswrapper[4870]: E0130 08:29:52.079012 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18\": container with ID starting with 7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18 not found: ID does not exist" containerID="7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18"
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.079137 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18"} err="failed to get container status \"7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18\": rpc error: code = NotFound desc = could not find container \"7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18\": container with ID starting with 7626fef9e744bdf23c5143f0c48395b24bab8239f9cfdb06207150e977be0d18 not found: ID does not exist"
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.079220 4870 scope.go:117] "RemoveContainer" containerID="2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1"
Jan 30 08:29:52 crc kubenswrapper[4870]: E0130 08:29:52.079610 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1\": container with ID starting with 2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1 not found: ID does not exist" containerID="2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1"
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.079694 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1"} err="failed to get container status \"2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1\": rpc error: code = NotFound desc = could not find container \"2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1\": container with ID starting with 2b63187d02cf3f1e134d09ad354fab4f72d729ac96dbea12fd138ec6e9c9f8b1 not found: ID does not exist"
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.082730 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.084045 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-scripts" (OuterVolumeSpecName: "scripts") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.101435 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bb64e7-45f2-4b8d-94f0-34c21ac75042-kube-api-access-46d96" (OuterVolumeSpecName: "kube-api-access-46d96") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "kube-api-access-46d96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.108300 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" path="/var/lib/kubelet/pods/cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88/volumes"
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.111080 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.129674 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-config-data" (OuterVolumeSpecName: "config-data") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.157753 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "65bb64e7-45f2-4b8d-94f0-34c21ac75042" (UID: "65bb64e7-45f2-4b8d-94f0-34c21ac75042"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.172991 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.173030 4870 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.173049 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.173059 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65bb64e7-45f2-4b8d-94f0-34c21ac75042-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.173067 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.173079 4870 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.173088 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46d96\" (UniqueName: \"kubernetes.io/projected/65bb64e7-45f2-4b8d-94f0-34c21ac75042-kube-api-access-46d96\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.173096 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bb64e7-45f2-4b8d-94f0-34c21ac75042-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.192155 4870 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.275079 4870 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.972195 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"743b8276-eb2e-49fa-b493-fb83f20837ed","Type":"ContainerStarted","Data":"68bb452d12c7f1b5a44e23d8dc4ff7ac7ee1407e95b1974d05face87ce2a046d"}
Jan 30 08:29:52 crc kubenswrapper[4870]: I0130 08:29:52.973841 4870 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.008353 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.008333909 podStartE2EDuration="4.008333909s" podCreationTimestamp="2026-01-30 08:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:52.995245329 +0000 UTC m=+1231.690792438" watchObservedRunningTime="2026-01-30 08:29:53.008333909 +0000 UTC m=+1231.703881018" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.018909 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.027181 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.035714 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:53 crc kubenswrapper[4870]: E0130 08:29:53.040774 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-httpd" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.040997 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-httpd" Jan 30 08:29:53 crc kubenswrapper[4870]: E0130 08:29:53.041151 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-httpd" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.041232 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-httpd" Jan 30 08:29:53 crc kubenswrapper[4870]: E0130 08:29:53.041317 4870 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-log" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.041384 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-log" Jan 30 08:29:53 crc kubenswrapper[4870]: E0130 08:29:53.041479 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-api" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.041553 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-api" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.041988 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-httpd" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.042063 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6b80f-bafd-4b52-acb3-da3ff3ac5e88" containerName="neutron-api" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.042136 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-httpd" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.042214 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" containerName="glance-log" Jan 30 08:29:53 crc kubenswrapper[4870]: E0130 08:29:53.041984 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65bb64e7_45f2_4b8d_94f0_34c21ac75042.slice/crio-efd4740849ad794d4e57051943e44782204bcc76846a706c3d143892f1a69cb3\": RecentStats: unable to find data in memory cache]" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.043367 4870 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.047418 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.048303 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.060795 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092241 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092315 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092406 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092456 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/2efb8d24-a358-43df-af27-d74c4cf88e1f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092677 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092746 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092858 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efb8d24-a358-43df-af27-d74c4cf88e1f-logs\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.092932 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7zc6\" (UniqueName: \"kubernetes.io/projected/2efb8d24-a358-43df-af27-d74c4cf88e1f-kube-api-access-l7zc6\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194244 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efb8d24-a358-43df-af27-d74c4cf88e1f-logs\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194295 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7zc6\" (UniqueName: \"kubernetes.io/projected/2efb8d24-a358-43df-af27-d74c4cf88e1f-kube-api-access-l7zc6\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194335 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194353 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194379 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194397 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2efb8d24-a358-43df-af27-d74c4cf88e1f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194461 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.194488 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.195524 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2efb8d24-a358-43df-af27-d74c4cf88e1f-logs\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.196108 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2efb8d24-a358-43df-af27-d74c4cf88e1f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.196108 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"2efb8d24-a358-43df-af27-d74c4cf88e1f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.200278 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.201232 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.203331 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.211333 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb8d24-a358-43df-af27-d74c4cf88e1f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.224811 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7zc6\" (UniqueName: \"kubernetes.io/projected/2efb8d24-a358-43df-af27-d74c4cf88e1f-kube-api-access-l7zc6\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.239419 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2efb8d24-a358-43df-af27-d74c4cf88e1f\") " pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.374846 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 08:29:53 crc kubenswrapper[4870]: I0130 08:29:53.954734 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 08:29:54 crc kubenswrapper[4870]: I0130 08:29:54.000350 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2efb8d24-a358-43df-af27-d74c4cf88e1f","Type":"ContainerStarted","Data":"d3081f983b11573196c6784a19ff207cfa84c01fc7f7702be98c15820b68e8a1"} Jan 30 08:29:54 crc kubenswrapper[4870]: I0130 08:29:54.135939 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bb64e7-45f2-4b8d-94f0-34c21ac75042" path="/var/lib/kubelet/pods/65bb64e7-45f2-4b8d-94f0-34c21ac75042/volumes" Jan 30 08:29:55 crc kubenswrapper[4870]: I0130 08:29:55.011359 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2efb8d24-a358-43df-af27-d74c4cf88e1f","Type":"ContainerStarted","Data":"dd93c75b57f31e2a7057630e58ff1cfb8d29f30e84496e1c78b444d92467d133"} Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.006938 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.023700 4870 generic.go:334] "Generic (PLEG): container finished" podID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerID="50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3" exitCode=0 Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.023759 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.023775 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerDied","Data":"50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3"} Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.023802 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e7b7679e-30f5-4f8a-96e0-a1581691242d","Type":"ContainerDied","Data":"97c0147556c32e49c5c20b79bd0b3e0fe461928cc98a023b7da24b4aa033c569"} Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.023818 4870 scope.go:117] "RemoveContainer" containerID="ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.025935 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2efb8d24-a358-43df-af27-d74c4cf88e1f","Type":"ContainerStarted","Data":"ae44013a1a049c729f8672b13eccebed76b78ad97103dcc1ce8e359e600fb29e"} Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.045660 4870 scope.go:117] "RemoveContainer" containerID="fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057226 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-run-httpd\") 
pod \"e7b7679e-30f5-4f8a-96e0-a1581691242d\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057376 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk6ng\" (UniqueName: \"kubernetes.io/projected/e7b7679e-30f5-4f8a-96e0-a1581691242d-kube-api-access-kk6ng\") pod \"e7b7679e-30f5-4f8a-96e0-a1581691242d\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057409 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-config-data\") pod \"e7b7679e-30f5-4f8a-96e0-a1581691242d\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057522 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-sg-core-conf-yaml\") pod \"e7b7679e-30f5-4f8a-96e0-a1581691242d\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057556 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-combined-ca-bundle\") pod \"e7b7679e-30f5-4f8a-96e0-a1581691242d\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057584 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-log-httpd\") pod \"e7b7679e-30f5-4f8a-96e0-a1581691242d\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057668 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-scripts\") pod \"e7b7679e-30f5-4f8a-96e0-a1581691242d\" (UID: \"e7b7679e-30f5-4f8a-96e0-a1581691242d\") " Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.057700 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e7b7679e-30f5-4f8a-96e0-a1581691242d" (UID: "e7b7679e-30f5-4f8a-96e0-a1581691242d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.058161 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e7b7679e-30f5-4f8a-96e0-a1581691242d" (UID: "e7b7679e-30f5-4f8a-96e0-a1581691242d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.058255 4870 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.058355 4870 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b7679e-30f5-4f8a-96e0-a1581691242d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.072743 4870 scope.go:117] "RemoveContainer" containerID="d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.081196 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.081178986 podStartE2EDuration="3.081178986s" podCreationTimestamp="2026-01-30 08:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:56.073803943 +0000 UTC m=+1234.769351052" watchObservedRunningTime="2026-01-30 08:29:56.081178986 +0000 UTC m=+1234.776726085" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.086128 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b7679e-30f5-4f8a-96e0-a1581691242d-kube-api-access-kk6ng" (OuterVolumeSpecName: "kube-api-access-kk6ng") pod "e7b7679e-30f5-4f8a-96e0-a1581691242d" (UID: "e7b7679e-30f5-4f8a-96e0-a1581691242d"). InnerVolumeSpecName "kube-api-access-kk6ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.088087 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-scripts" (OuterVolumeSpecName: "scripts") pod "e7b7679e-30f5-4f8a-96e0-a1581691242d" (UID: "e7b7679e-30f5-4f8a-96e0-a1581691242d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.095333 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e7b7679e-30f5-4f8a-96e0-a1581691242d" (UID: "e7b7679e-30f5-4f8a-96e0-a1581691242d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.152199 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7b7679e-30f5-4f8a-96e0-a1581691242d" (UID: "e7b7679e-30f5-4f8a-96e0-a1581691242d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.160669 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk6ng\" (UniqueName: \"kubernetes.io/projected/e7b7679e-30f5-4f8a-96e0-a1581691242d-kube-api-access-kk6ng\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.160699 4870 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.160708 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.160717 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.172982 4870 scope.go:117] "RemoveContainer" containerID="50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.199070 4870 scope.go:117] "RemoveContainer" containerID="ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249" Jan 30 08:29:56 crc kubenswrapper[4870]: E0130 08:29:56.199492 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249\": container with ID starting with ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249 not found: ID does not exist" containerID="ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 
08:29:56.199535 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249"} err="failed to get container status \"ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249\": rpc error: code = NotFound desc = could not find container \"ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249\": container with ID starting with ea2dc3bf6b01b6486bc1be0495050fed1248c3c10f9f393da82381c4f2c83249 not found: ID does not exist" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.199564 4870 scope.go:117] "RemoveContainer" containerID="fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8" Jan 30 08:29:56 crc kubenswrapper[4870]: E0130 08:29:56.199839 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8\": container with ID starting with fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8 not found: ID does not exist" containerID="fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.199892 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8"} err="failed to get container status \"fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8\": rpc error: code = NotFound desc = could not find container \"fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8\": container with ID starting with fb611dcaa5785bae71aa5640bfd419f130d7c82c2c11c264d4870922c0ea5ab8 not found: ID does not exist" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.199915 4870 scope.go:117] "RemoveContainer" containerID="d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4" Jan 30 08:29:56 crc 
kubenswrapper[4870]: E0130 08:29:56.200128 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4\": container with ID starting with d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4 not found: ID does not exist" containerID="d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.200155 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4"} err="failed to get container status \"d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4\": rpc error: code = NotFound desc = could not find container \"d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4\": container with ID starting with d0c9d684e4c518328cada1bb39aa6ed8b3369b2dc238ef8d275c56c25fcc12f4 not found: ID does not exist" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.200171 4870 scope.go:117] "RemoveContainer" containerID="50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3" Jan 30 08:29:56 crc kubenswrapper[4870]: E0130 08:29:56.200527 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3\": container with ID starting with 50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3 not found: ID does not exist" containerID="50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.200558 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3"} err="failed to get container status 
\"50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3\": rpc error: code = NotFound desc = could not find container \"50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3\": container with ID starting with 50cd0eb040bd3e32d35994ee8e1046627f6c90e1405d0d6272ca90428017a5c3 not found: ID does not exist" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.205933 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-config-data" (OuterVolumeSpecName: "config-data") pod "e7b7679e-30f5-4f8a-96e0-a1581691242d" (UID: "e7b7679e-30f5-4f8a-96e0-a1581691242d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.262445 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b7679e-30f5-4f8a-96e0-a1581691242d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.357172 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.371713 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381032 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:56 crc kubenswrapper[4870]: E0130 08:29:56.381453 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-notification-agent" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381471 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-notification-agent" Jan 30 08:29:56 crc kubenswrapper[4870]: E0130 08:29:56.381519 4870 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-central-agent" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381527 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-central-agent" Jan 30 08:29:56 crc kubenswrapper[4870]: E0130 08:29:56.381544 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="sg-core" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381550 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="sg-core" Jan 30 08:29:56 crc kubenswrapper[4870]: E0130 08:29:56.381571 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="proxy-httpd" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381577 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="proxy-httpd" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381800 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="proxy-httpd" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381812 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-notification-agent" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381824 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="ceilometer-central-agent" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.381834 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" containerName="sg-core" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.383559 4870 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.386908 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.387217 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.394978 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.466386 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.466476 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-scripts\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.466528 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-run-httpd\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.466563 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh7f\" (UniqueName: \"kubernetes.io/projected/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-kube-api-access-dvh7f\") pod \"ceilometer-0\" (UID: 
\"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.466636 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-log-httpd\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.466687 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.466718 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-config-data\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.568820 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-log-httpd\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.568906 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.568933 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-config-data\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.568991 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.569032 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-scripts\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.569069 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-run-httpd\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.569105 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh7f\" (UniqueName: \"kubernetes.io/projected/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-kube-api-access-dvh7f\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.569404 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-log-httpd\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " 
pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.569435 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-run-httpd\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.573312 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.575035 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-config-data\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.576072 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-scripts\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.577494 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.587349 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvh7f\" (UniqueName: 
\"kubernetes.io/projected/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-kube-api-access-dvh7f\") pod \"ceilometer-0\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.677206 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rxztf"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.678936 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.692438 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rxztf"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.709495 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.758086 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-p626s"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.760423 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.779207 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0467c513-d47e-4251-a042-74a1f0a3ba8e-operator-scripts\") pod \"nova-api-db-create-rxztf\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.779478 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8kz\" (UniqueName: \"kubernetes.io/projected/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-kube-api-access-hn8kz\") pod \"nova-cell0-db-create-p626s\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.779594 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkht5\" (UniqueName: \"kubernetes.io/projected/0467c513-d47e-4251-a042-74a1f0a3ba8e-kube-api-access-mkht5\") pod \"nova-api-db-create-rxztf\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.779714 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-operator-scripts\") pod \"nova-cell0-db-create-p626s\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.841822 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p626s"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.881216 4870 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mkht5\" (UniqueName: \"kubernetes.io/projected/0467c513-d47e-4251-a042-74a1f0a3ba8e-kube-api-access-mkht5\") pod \"nova-api-db-create-rxztf\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.881273 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-operator-scripts\") pod \"nova-cell0-db-create-p626s\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.881378 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0467c513-d47e-4251-a042-74a1f0a3ba8e-operator-scripts\") pod \"nova-api-db-create-rxztf\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.881400 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8kz\" (UniqueName: \"kubernetes.io/projected/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-kube-api-access-hn8kz\") pod \"nova-cell0-db-create-p626s\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.890318 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-operator-scripts\") pod \"nova-cell0-db-create-p626s\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.891114 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0467c513-d47e-4251-a042-74a1f0a3ba8e-operator-scripts\") pod \"nova-api-db-create-rxztf\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.944990 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8kz\" (UniqueName: \"kubernetes.io/projected/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-kube-api-access-hn8kz\") pod \"nova-cell0-db-create-p626s\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.949495 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkht5\" (UniqueName: \"kubernetes.io/projected/0467c513-d47e-4251-a042-74a1f0a3ba8e-kube-api-access-mkht5\") pod \"nova-api-db-create-rxztf\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.966013 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-89bf-account-create-update-s9p8t"] Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.967137 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.970282 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 08:29:56 crc kubenswrapper[4870]: I0130 08:29:56.990956 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-89bf-account-create-update-s9p8t"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.000976 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mz9qm"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.002435 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.002898 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rxztf" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.028200 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mz9qm"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.088032 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7c52-account-create-update-bc4lx"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.089344 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.091052 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqtlz\" (UniqueName: \"kubernetes.io/projected/6cd82862-2bef-4d86-be4e-38f670a252bd-kube-api-access-rqtlz\") pod \"nova-cell1-db-create-mz9qm\" (UID: \"6cd82862-2bef-4d86-be4e-38f670a252bd\") " pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.091153 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-operator-scripts\") pod \"nova-api-89bf-account-create-update-s9p8t\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.091180 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cd82862-2bef-4d86-be4e-38f670a252bd-operator-scripts\") pod \"nova-cell1-db-create-mz9qm\" (UID: 
\"6cd82862-2bef-4d86-be4e-38f670a252bd\") " pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.091224 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdhf\" (UniqueName: \"kubernetes.io/projected/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-kube-api-access-kvdhf\") pod \"nova-api-89bf-account-create-update-s9p8t\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.093113 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.100860 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7c52-account-create-update-bc4lx"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.175287 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9204-account-create-update-pczk5"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.176486 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.183706 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.187601 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.192501 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqtlz\" (UniqueName: \"kubernetes.io/projected/6cd82862-2bef-4d86-be4e-38f670a252bd-kube-api-access-rqtlz\") pod \"nova-cell1-db-create-mz9qm\" (UID: \"6cd82862-2bef-4d86-be4e-38f670a252bd\") " pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.192984 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa80552-6dc1-43b4-ba32-8fca58595c32-operator-scripts\") pod \"nova-cell0-7c52-account-create-update-bc4lx\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.193041 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djvzs\" (UniqueName: \"kubernetes.io/projected/9aa80552-6dc1-43b4-ba32-8fca58595c32-kube-api-access-djvzs\") pod \"nova-cell0-7c52-account-create-update-bc4lx\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.193090 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-operator-scripts\") pod \"nova-api-89bf-account-create-update-s9p8t\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:57 crc 
kubenswrapper[4870]: I0130 08:29:57.193113 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cd82862-2bef-4d86-be4e-38f670a252bd-operator-scripts\") pod \"nova-cell1-db-create-mz9qm\" (UID: \"6cd82862-2bef-4d86-be4e-38f670a252bd\") " pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.193210 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdhf\" (UniqueName: \"kubernetes.io/projected/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-kube-api-access-kvdhf\") pod \"nova-api-89bf-account-create-update-s9p8t\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.195639 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-operator-scripts\") pod \"nova-api-89bf-account-create-update-s9p8t\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.196145 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cd82862-2bef-4d86-be4e-38f670a252bd-operator-scripts\") pod \"nova-cell1-db-create-mz9qm\" (UID: \"6cd82862-2bef-4d86-be4e-38f670a252bd\") " pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.215384 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9204-account-create-update-pczk5"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.215739 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdhf\" (UniqueName: 
\"kubernetes.io/projected/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-kube-api-access-kvdhf\") pod \"nova-api-89bf-account-create-update-s9p8t\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.220064 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqtlz\" (UniqueName: \"kubernetes.io/projected/6cd82862-2bef-4d86-be4e-38f670a252bd-kube-api-access-rqtlz\") pod \"nova-cell1-db-create-mz9qm\" (UID: \"6cd82862-2bef-4d86-be4e-38f670a252bd\") " pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.294398 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf298cb-af81-4272-aacd-2d1342eab106-operator-scripts\") pod \"nova-cell1-9204-account-create-update-pczk5\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.294683 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa80552-6dc1-43b4-ba32-8fca58595c32-operator-scripts\") pod \"nova-cell0-7c52-account-create-update-bc4lx\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.294798 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djvzs\" (UniqueName: \"kubernetes.io/projected/9aa80552-6dc1-43b4-ba32-8fca58595c32-kube-api-access-djvzs\") pod \"nova-cell0-7c52-account-create-update-bc4lx\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.294971 
4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsbr\" (UniqueName: \"kubernetes.io/projected/adf298cb-af81-4272-aacd-2d1342eab106-kube-api-access-6xsbr\") pod \"nova-cell1-9204-account-create-update-pczk5\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.295687 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa80552-6dc1-43b4-ba32-8fca58595c32-operator-scripts\") pod \"nova-cell0-7c52-account-create-update-bc4lx\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.314387 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djvzs\" (UniqueName: \"kubernetes.io/projected/9aa80552-6dc1-43b4-ba32-8fca58595c32-kube-api-access-djvzs\") pod \"nova-cell0-7c52-account-create-update-bc4lx\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.351700 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.376012 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.397774 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsbr\" (UniqueName: \"kubernetes.io/projected/adf298cb-af81-4272-aacd-2d1342eab106-kube-api-access-6xsbr\") pod \"nova-cell1-9204-account-create-update-pczk5\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.397859 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf298cb-af81-4272-aacd-2d1342eab106-operator-scripts\") pod \"nova-cell1-9204-account-create-update-pczk5\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.398846 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf298cb-af81-4272-aacd-2d1342eab106-operator-scripts\") pod \"nova-cell1-9204-account-create-update-pczk5\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.423732 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsbr\" (UniqueName: \"kubernetes.io/projected/adf298cb-af81-4272-aacd-2d1342eab106-kube-api-access-6xsbr\") pod \"nova-cell1-9204-account-create-update-pczk5\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.457804 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.517640 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.530677 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.673813 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rxztf"] Jan 30 08:29:57 crc kubenswrapper[4870]: I0130 08:29:57.804959 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p626s"] Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.012983 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-89bf-account-create-update-s9p8t"] Jan 30 08:29:58 crc kubenswrapper[4870]: W0130 08:29:58.017942 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf41a5ec0_d6c7_47ab_b69f_c6c2a8bc4981.slice/crio-b9366ec5a789f4161d4c8849c5ae5ee374b040f54371faa892bb11cebebfe432 WatchSource:0}: Error finding container b9366ec5a789f4161d4c8849c5ae5ee374b040f54371faa892bb11cebebfe432: Status 404 returned error can't find the container with id b9366ec5a789f4161d4c8849c5ae5ee374b040f54371faa892bb11cebebfe432 Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.084321 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b7679e-30f5-4f8a-96e0-a1581691242d" path="/var/lib/kubelet/pods/e7b7679e-30f5-4f8a-96e0-a1581691242d/volumes" Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.085185 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rxztf" 
event={"ID":"0467c513-d47e-4251-a042-74a1f0a3ba8e","Type":"ContainerStarted","Data":"5940ff1fdac1d8f6908fb28770dca0cedf697f4f1b3f1ea8731ce8c9ec261c73"} Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.085212 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p626s" event={"ID":"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28","Type":"ContainerStarted","Data":"7f0343f1954a5bd424e045791df859ecb1dc66660b99fda30c6f6833c2f1eac9"} Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.085938 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-89bf-account-create-update-s9p8t" event={"ID":"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981","Type":"ContainerStarted","Data":"b9366ec5a789f4161d4c8849c5ae5ee374b040f54371faa892bb11cebebfe432"} Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.094498 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerStarted","Data":"2b005b2e14aaf89586d8bc7aa8c8d809f04257ae6b9ce25164b29b34269d45d4"} Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.128556 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mz9qm"] Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.218683 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.219820 4870 scope.go:117] "RemoveContainer" containerID="bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba" Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.221515 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 30 08:29:58 crc kubenswrapper[4870]: I0130 08:29:58.243813 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7c52-account-create-update-bc4lx"] Jan 30 08:29:58 crc 
kubenswrapper[4870]: I0130 08:29:58.291556 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9204-account-create-update-pczk5"] Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.103827 4870 generic.go:334] "Generic (PLEG): container finished" podID="b82e1e2b-e78e-4b8f-8303-2ea82b24bf28" containerID="23a36de41e3e5413c9d4a8e53e9d9062761ceb2d5ea6dc50cc6414dd812317b7" exitCode=0 Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.103928 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p626s" event={"ID":"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28","Type":"ContainerDied","Data":"23a36de41e3e5413c9d4a8e53e9d9062761ceb2d5ea6dc50cc6414dd812317b7"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.105606 4870 generic.go:334] "Generic (PLEG): container finished" podID="f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981" containerID="10d8adf976aee141ddedf0f0b7d4a560074ff0040c0d225d7fde8dac560cebcd" exitCode=0 Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.105671 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-89bf-account-create-update-s9p8t" event={"ID":"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981","Type":"ContainerDied","Data":"10d8adf976aee141ddedf0f0b7d4a560074ff0040c0d225d7fde8dac560cebcd"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.107106 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerStarted","Data":"f199bc9354b47787d97254410fafef753eb2ceb40e32f6f8bddff5ad8283b2f1"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.108397 4870 generic.go:334] "Generic (PLEG): container finished" podID="6cd82862-2bef-4d86-be4e-38f670a252bd" containerID="046d4010b0f900fe2cbd28328fdfa8554886e3c18049908c92ba7d45ff824b80" exitCode=0 Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.108525 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-mz9qm" event={"ID":"6cd82862-2bef-4d86-be4e-38f670a252bd","Type":"ContainerDied","Data":"046d4010b0f900fe2cbd28328fdfa8554886e3c18049908c92ba7d45ff824b80"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.108553 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mz9qm" event={"ID":"6cd82862-2bef-4d86-be4e-38f670a252bd","Type":"ContainerStarted","Data":"576369a1b841249ea912958de987ffaa8827905c5a3d87b5b878429fd342a86a"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.109947 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerStarted","Data":"2d34ee72279320556589f3ce491e8f0d6dfe6c0bb318d8e32fba2f3e68f571e6"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.109975 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerStarted","Data":"da041b1ddf9f3cc720ab7d45cc97fcefcbe1006c471d1d13b08c5f348d603138"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.111099 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9204-account-create-update-pczk5" event={"ID":"adf298cb-af81-4272-aacd-2d1342eab106","Type":"ContainerStarted","Data":"ad4987bcd683a82b2ef435208c93f9f9d4904561809fe23aa9fd681008a558c6"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.111319 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9204-account-create-update-pczk5" event={"ID":"adf298cb-af81-4272-aacd-2d1342eab106","Type":"ContainerStarted","Data":"ea0efa38a2c03791e8448804b38f7329e64d06e4e94b3314f5ffe51172a0bdc0"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.119786 4870 generic.go:334] "Generic (PLEG): container finished" podID="0467c513-d47e-4251-a042-74a1f0a3ba8e" 
containerID="b3747f1a7b0dcf93ef3e9971ceb218b892bb2531c608e6a07760d677c25d7633" exitCode=0 Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.119904 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rxztf" event={"ID":"0467c513-d47e-4251-a042-74a1f0a3ba8e","Type":"ContainerDied","Data":"b3747f1a7b0dcf93ef3e9971ceb218b892bb2531c608e6a07760d677c25d7633"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.133304 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" event={"ID":"9aa80552-6dc1-43b4-ba32-8fca58595c32","Type":"ContainerStarted","Data":"e7588860011aa90e39e44c8b147a927646ababcde64482fc80c549dc156bfaf7"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.133349 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" event={"ID":"9aa80552-6dc1-43b4-ba32-8fca58595c32","Type":"ContainerStarted","Data":"9957703b2d6046e1e86868f06c5b0be9434430ada46171570bd0491396cf389f"} Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.219351 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-9204-account-create-update-pczk5" podStartSLOduration=2.219335163 podStartE2EDuration="2.219335163s" podCreationTimestamp="2026-01-30 08:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:59.217179005 +0000 UTC m=+1237.912726114" watchObservedRunningTime="2026-01-30 08:29:59.219335163 +0000 UTC m=+1237.914882272" Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.245246 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" podStartSLOduration=2.245219876 podStartE2EDuration="2.245219876s" podCreationTimestamp="2026-01-30 08:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:29:59.235269144 +0000 UTC m=+1237.930816253" watchObservedRunningTime="2026-01-30 08:29:59.245219876 +0000 UTC m=+1237.940766985" Jan 30 08:29:59 crc kubenswrapper[4870]: I0130 08:29:59.726563 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.117457 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.149471 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerStarted","Data":"734eba5ad7a5ddcb654b171f45ad07bc974840fb2dfafb87d21bbd4db9b452f9"} Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.152652 4870 generic.go:334] "Generic (PLEG): container finished" podID="adf298cb-af81-4272-aacd-2d1342eab106" containerID="ad4987bcd683a82b2ef435208c93f9f9d4904561809fe23aa9fd681008a558c6" exitCode=0 Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.152710 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9204-account-create-update-pczk5" event={"ID":"adf298cb-af81-4272-aacd-2d1342eab106","Type":"ContainerDied","Data":"ad4987bcd683a82b2ef435208c93f9f9d4904561809fe23aa9fd681008a558c6"} Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.154801 4870 generic.go:334] "Generic (PLEG): container finished" podID="9aa80552-6dc1-43b4-ba32-8fca58595c32" containerID="e7588860011aa90e39e44c8b147a927646ababcde64482fc80c549dc156bfaf7" exitCode=0 Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.155228 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" 
event={"ID":"9aa80552-6dc1-43b4-ba32-8fca58595c32","Type":"ContainerDied","Data":"e7588860011aa90e39e44c8b147a927646ababcde64482fc80c549dc156bfaf7"} Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.173021 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c"] Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.178039 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.181843 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.182114 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.215946 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c"] Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.416497 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2h5b\" (UniqueName: \"kubernetes.io/projected/3ecb0f40-780e-4f90-84aa-17af92178d88-kube-api-access-l2h5b\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.417432 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ecb0f40-780e-4f90-84aa-17af92178d88-config-volume\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.417626 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ecb0f40-780e-4f90-84aa-17af92178d88-secret-volume\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.439183 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.439345 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.505182 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.509371 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.521948 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2h5b\" (UniqueName: \"kubernetes.io/projected/3ecb0f40-780e-4f90-84aa-17af92178d88-kube-api-access-l2h5b\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.522228 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ecb0f40-780e-4f90-84aa-17af92178d88-config-volume\") pod \"collect-profiles-29496030-smd5c\" (UID: 
\"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.522380 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ecb0f40-780e-4f90-84aa-17af92178d88-secret-volume\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.524082 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ecb0f40-780e-4f90-84aa-17af92178d88-config-volume\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.529819 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ecb0f40-780e-4f90-84aa-17af92178d88-secret-volume\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.548141 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2h5b\" (UniqueName: \"kubernetes.io/projected/3ecb0f40-780e-4f90-84aa-17af92178d88-kube-api-access-l2h5b\") pod \"collect-profiles-29496030-smd5c\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.725437 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.830706 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-operator-scripts\") pod \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.830827 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn8kz\" (UniqueName: \"kubernetes.io/projected/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-kube-api-access-hn8kz\") pod \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\" (UID: \"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28\") " Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.831498 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b82e1e2b-e78e-4b8f-8303-2ea82b24bf28" (UID: "b82e1e2b-e78e-4b8f-8303-2ea82b24bf28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.836840 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-kube-api-access-hn8kz" (OuterVolumeSpecName: "kube-api-access-hn8kz") pod "b82e1e2b-e78e-4b8f-8303-2ea82b24bf28" (UID: "b82e1e2b-e78e-4b8f-8303-2ea82b24bf28"). InnerVolumeSpecName "kube-api-access-hn8kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.843944 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.877493 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rxztf" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.931136 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.935318 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0467c513-d47e-4251-a042-74a1f0a3ba8e-operator-scripts\") pod \"0467c513-d47e-4251-a042-74a1f0a3ba8e\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.935489 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkht5\" (UniqueName: \"kubernetes.io/projected/0467c513-d47e-4251-a042-74a1f0a3ba8e-kube-api-access-mkht5\") pod \"0467c513-d47e-4251-a042-74a1f0a3ba8e\" (UID: \"0467c513-d47e-4251-a042-74a1f0a3ba8e\") " Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.936040 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0467c513-d47e-4251-a042-74a1f0a3ba8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0467c513-d47e-4251-a042-74a1f0a3ba8e" (UID: "0467c513-d47e-4251-a042-74a1f0a3ba8e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.936063 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.936111 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn8kz\" (UniqueName: \"kubernetes.io/projected/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28-kube-api-access-hn8kz\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.943076 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0467c513-d47e-4251-a042-74a1f0a3ba8e-kube-api-access-mkht5" (OuterVolumeSpecName: "kube-api-access-mkht5") pod "0467c513-d47e-4251-a042-74a1f0a3ba8e" (UID: "0467c513-d47e-4251-a042-74a1f0a3ba8e"). InnerVolumeSpecName "kube-api-access-mkht5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:00 crc kubenswrapper[4870]: I0130 08:30:00.982750 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.038832 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cd82862-2bef-4d86-be4e-38f670a252bd-operator-scripts\") pod \"6cd82862-2bef-4d86-be4e-38f670a252bd\" (UID: \"6cd82862-2bef-4d86-be4e-38f670a252bd\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.039167 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqtlz\" (UniqueName: \"kubernetes.io/projected/6cd82862-2bef-4d86-be4e-38f670a252bd-kube-api-access-rqtlz\") pod \"6cd82862-2bef-4d86-be4e-38f670a252bd\" (UID: \"6cd82862-2bef-4d86-be4e-38f670a252bd\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.039239 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-operator-scripts\") pod \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.039363 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvdhf\" (UniqueName: \"kubernetes.io/projected/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-kube-api-access-kvdhf\") pod \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\" (UID: \"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.042313 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981" (UID: "f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.045440 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd82862-2bef-4d86-be4e-38f670a252bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6cd82862-2bef-4d86-be4e-38f670a252bd" (UID: "6cd82862-2bef-4d86-be4e-38f670a252bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.046828 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkht5\" (UniqueName: \"kubernetes.io/projected/0467c513-d47e-4251-a042-74a1f0a3ba8e-kube-api-access-mkht5\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.046858 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cd82862-2bef-4d86-be4e-38f670a252bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.046869 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.046900 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0467c513-d47e-4251-a042-74a1f0a3ba8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.052453 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd82862-2bef-4d86-be4e-38f670a252bd-kube-api-access-rqtlz" (OuterVolumeSpecName: "kube-api-access-rqtlz") pod "6cd82862-2bef-4d86-be4e-38f670a252bd" (UID: "6cd82862-2bef-4d86-be4e-38f670a252bd"). 
InnerVolumeSpecName "kube-api-access-rqtlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.061033 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-kube-api-access-kvdhf" (OuterVolumeSpecName: "kube-api-access-kvdhf") pod "f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981" (UID: "f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981"). InnerVolumeSpecName "kube-api-access-kvdhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.149139 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqtlz\" (UniqueName: \"kubernetes.io/projected/6cd82862-2bef-4d86-be4e-38f670a252bd-kube-api-access-rqtlz\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.150063 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvdhf\" (UniqueName: \"kubernetes.io/projected/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981-kube-api-access-kvdhf\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.166107 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-89bf-account-create-update-s9p8t" event={"ID":"f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981","Type":"ContainerDied","Data":"b9366ec5a789f4161d4c8849c5ae5ee374b040f54371faa892bb11cebebfe432"} Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.166142 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9366ec5a789f4161d4c8849c5ae5ee374b040f54371faa892bb11cebebfe432" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.166214 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-89bf-account-create-update-s9p8t" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.167909 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mz9qm" event={"ID":"6cd82862-2bef-4d86-be4e-38f670a252bd","Type":"ContainerDied","Data":"576369a1b841249ea912958de987ffaa8827905c5a3d87b5b878429fd342a86a"} Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.167942 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="576369a1b841249ea912958de987ffaa8827905c5a3d87b5b878429fd342a86a" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.168008 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mz9qm" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.172953 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rxztf" event={"ID":"0467c513-d47e-4251-a042-74a1f0a3ba8e","Type":"ContainerDied","Data":"5940ff1fdac1d8f6908fb28770dca0cedf697f4f1b3f1ea8731ce8c9ec261c73"} Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.173003 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5940ff1fdac1d8f6908fb28770dca0cedf697f4f1b3f1ea8731ce8c9ec261c73" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.173081 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rxztf" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.181979 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p626s" event={"ID":"b82e1e2b-e78e-4b8f-8303-2ea82b24bf28","Type":"ContainerDied","Data":"7f0343f1954a5bd424e045791df859ecb1dc66660b99fda30c6f6833c2f1eac9"} Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.182021 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f0343f1954a5bd424e045791df859ecb1dc66660b99fda30c6f6833c2f1eac9" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.182079 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p626s" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.189094 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"204a0d39-f7b0-4468-a82f-9fcc49fc1281","Type":"ContainerStarted","Data":"219179db9ef08389269b0601fc5a735b1c5004657ec7eeb3fa4440e5907cfe2d"} Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.189317 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.189342 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.212751 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.603126337 podStartE2EDuration="41.212717383s" podCreationTimestamp="2026-01-30 08:29:20 +0000 UTC" firstStartedPulling="2026-01-30 08:29:23.751755258 +0000 UTC m=+1202.447302367" lastFinishedPulling="2026-01-30 08:30:00.361346314 +0000 UTC m=+1239.056893413" observedRunningTime="2026-01-30 08:30:01.212636041 +0000 UTC m=+1239.908183160" 
watchObservedRunningTime="2026-01-30 08:30:01.212717383 +0000 UTC m=+1239.908264492" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.367372 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c"] Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.722053 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.730836 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.782946 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf298cb-af81-4272-aacd-2d1342eab106-operator-scripts\") pod \"adf298cb-af81-4272-aacd-2d1342eab106\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.783307 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa80552-6dc1-43b4-ba32-8fca58595c32-operator-scripts\") pod \"9aa80552-6dc1-43b4-ba32-8fca58595c32\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.783385 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djvzs\" (UniqueName: \"kubernetes.io/projected/9aa80552-6dc1-43b4-ba32-8fca58595c32-kube-api-access-djvzs\") pod \"9aa80552-6dc1-43b4-ba32-8fca58595c32\" (UID: \"9aa80552-6dc1-43b4-ba32-8fca58595c32\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.783408 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsbr\" (UniqueName: 
\"kubernetes.io/projected/adf298cb-af81-4272-aacd-2d1342eab106-kube-api-access-6xsbr\") pod \"adf298cb-af81-4272-aacd-2d1342eab106\" (UID: \"adf298cb-af81-4272-aacd-2d1342eab106\") " Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.785159 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa80552-6dc1-43b4-ba32-8fca58595c32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9aa80552-6dc1-43b4-ba32-8fca58595c32" (UID: "9aa80552-6dc1-43b4-ba32-8fca58595c32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.785239 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf298cb-af81-4272-aacd-2d1342eab106-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adf298cb-af81-4272-aacd-2d1342eab106" (UID: "adf298cb-af81-4272-aacd-2d1342eab106"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.794423 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf298cb-af81-4272-aacd-2d1342eab106-kube-api-access-6xsbr" (OuterVolumeSpecName: "kube-api-access-6xsbr") pod "adf298cb-af81-4272-aacd-2d1342eab106" (UID: "adf298cb-af81-4272-aacd-2d1342eab106"). InnerVolumeSpecName "kube-api-access-6xsbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.794534 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa80552-6dc1-43b4-ba32-8fca58595c32-kube-api-access-djvzs" (OuterVolumeSpecName: "kube-api-access-djvzs") pod "9aa80552-6dc1-43b4-ba32-8fca58595c32" (UID: "9aa80552-6dc1-43b4-ba32-8fca58595c32"). InnerVolumeSpecName "kube-api-access-djvzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.886039 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf298cb-af81-4272-aacd-2d1342eab106-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.886068 4870 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aa80552-6dc1-43b4-ba32-8fca58595c32-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.886080 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djvzs\" (UniqueName: \"kubernetes.io/projected/9aa80552-6dc1-43b4-ba32-8fca58595c32-kube-api-access-djvzs\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:01 crc kubenswrapper[4870]: I0130 08:30:01.886090 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsbr\" (UniqueName: \"kubernetes.io/projected/adf298cb-af81-4272-aacd-2d1342eab106-kube-api-access-6xsbr\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.201648 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9204-account-create-update-pczk5" event={"ID":"adf298cb-af81-4272-aacd-2d1342eab106","Type":"ContainerDied","Data":"ea0efa38a2c03791e8448804b38f7329e64d06e4e94b3314f5ffe51172a0bdc0"} Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.201686 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea0efa38a2c03791e8448804b38f7329e64d06e4e94b3314f5ffe51172a0bdc0" Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.203789 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" 
event={"ID":"9aa80552-6dc1-43b4-ba32-8fca58595c32","Type":"ContainerDied","Data":"9957703b2d6046e1e86868f06c5b0be9434430ada46171570bd0491396cf389f"} Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.203813 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9957703b2d6046e1e86868f06c5b0be9434430ada46171570bd0491396cf389f" Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.203936 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7c52-account-create-update-bc4lx" Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.204031 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9204-account-create-update-pczk5" Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.207047 4870 generic.go:334] "Generic (PLEG): container finished" podID="3ecb0f40-780e-4f90-84aa-17af92178d88" containerID="dad0dbfc8aebf8b014e37d2f50b6d2deebcdfeb8419d761ea14c44680273c1c3" exitCode=0 Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.207157 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" event={"ID":"3ecb0f40-780e-4f90-84aa-17af92178d88","Type":"ContainerDied","Data":"dad0dbfc8aebf8b014e37d2f50b6d2deebcdfeb8419d761ea14c44680273c1c3"} Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.207296 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" event={"ID":"3ecb0f40-780e-4f90-84aa-17af92178d88","Type":"ContainerStarted","Data":"a526ed944565bd757b9a4bba685fb1dd9feaa4eda6024207094fe861d1f2ad25"} Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.212245 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerStarted","Data":"1bc5247fa1f481f03b4623f335ffd2271a8804ce7729b91b95a52302230b32d9"} Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.212960 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-central-agent" containerID="cri-o://da041b1ddf9f3cc720ab7d45cc97fcefcbe1006c471d1d13b08c5f348d603138" gracePeriod=30 Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.213067 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="proxy-httpd" containerID="cri-o://1bc5247fa1f481f03b4623f335ffd2271a8804ce7729b91b95a52302230b32d9" gracePeriod=30 Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.213107 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="sg-core" containerID="cri-o://734eba5ad7a5ddcb654b171f45ad07bc974840fb2dfafb87d21bbd4db9b452f9" gracePeriod=30 Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.213139 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-notification-agent" containerID="cri-o://2d34ee72279320556589f3ce491e8f0d6dfe6c0bb318d8e32fba2f3e68f571e6" gracePeriod=30 Jan 30 08:30:02 crc kubenswrapper[4870]: I0130 08:30:02.242791 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.336838056 podStartE2EDuration="6.242775747s" podCreationTimestamp="2026-01-30 08:29:56 +0000 UTC" firstStartedPulling="2026-01-30 08:29:57.596842816 +0000 UTC m=+1236.292389925" lastFinishedPulling="2026-01-30 08:30:01.502780507 +0000 UTC m=+1240.198327616" 
observedRunningTime="2026-01-30 08:30:02.241363892 +0000 UTC m=+1240.936911001" watchObservedRunningTime="2026-01-30 08:30:02.242775747 +0000 UTC m=+1240.938322856" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.221768 4870 generic.go:334] "Generic (PLEG): container finished" podID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerID="1bc5247fa1f481f03b4623f335ffd2271a8804ce7729b91b95a52302230b32d9" exitCode=0 Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.221798 4870 generic.go:334] "Generic (PLEG): container finished" podID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerID="734eba5ad7a5ddcb654b171f45ad07bc974840fb2dfafb87d21bbd4db9b452f9" exitCode=2 Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.221806 4870 generic.go:334] "Generic (PLEG): container finished" podID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerID="2d34ee72279320556589f3ce491e8f0d6dfe6c0bb318d8e32fba2f3e68f571e6" exitCode=0 Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.221965 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerDied","Data":"1bc5247fa1f481f03b4623f335ffd2271a8804ce7729b91b95a52302230b32d9"} Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.221989 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerDied","Data":"734eba5ad7a5ddcb654b171f45ad07bc974840fb2dfafb87d21bbd4db9b452f9"} Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.221998 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerDied","Data":"2d34ee72279320556589f3ce491e8f0d6dfe6c0bb318d8e32fba2f3e68f571e6"} Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.375334 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.376108 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.386507 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.386583 4870 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.428368 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.438753 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.480348 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.592354 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.729088 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2h5b\" (UniqueName: \"kubernetes.io/projected/3ecb0f40-780e-4f90-84aa-17af92178d88-kube-api-access-l2h5b\") pod \"3ecb0f40-780e-4f90-84aa-17af92178d88\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.729251 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ecb0f40-780e-4f90-84aa-17af92178d88-config-volume\") pod \"3ecb0f40-780e-4f90-84aa-17af92178d88\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.729411 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ecb0f40-780e-4f90-84aa-17af92178d88-secret-volume\") pod \"3ecb0f40-780e-4f90-84aa-17af92178d88\" (UID: \"3ecb0f40-780e-4f90-84aa-17af92178d88\") " Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.730099 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ecb0f40-780e-4f90-84aa-17af92178d88-config-volume" (OuterVolumeSpecName: "config-volume") pod "3ecb0f40-780e-4f90-84aa-17af92178d88" (UID: "3ecb0f40-780e-4f90-84aa-17af92178d88"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.738785 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ecb0f40-780e-4f90-84aa-17af92178d88-kube-api-access-l2h5b" (OuterVolumeSpecName: "kube-api-access-l2h5b") pod "3ecb0f40-780e-4f90-84aa-17af92178d88" (UID: "3ecb0f40-780e-4f90-84aa-17af92178d88"). 
InnerVolumeSpecName "kube-api-access-l2h5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.743755 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecb0f40-780e-4f90-84aa-17af92178d88-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3ecb0f40-780e-4f90-84aa-17af92178d88" (UID: "3ecb0f40-780e-4f90-84aa-17af92178d88"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.835121 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3ecb0f40-780e-4f90-84aa-17af92178d88-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.835155 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2h5b\" (UniqueName: \"kubernetes.io/projected/3ecb0f40-780e-4f90-84aa-17af92178d88-kube-api-access-l2h5b\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:03 crc kubenswrapper[4870]: I0130 08:30:03.835165 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ecb0f40-780e-4f90-84aa-17af92178d88-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:04 crc kubenswrapper[4870]: I0130 08:30:04.234095 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" event={"ID":"3ecb0f40-780e-4f90-84aa-17af92178d88","Type":"ContainerDied","Data":"a526ed944565bd757b9a4bba685fb1dd9feaa4eda6024207094fe861d1f2ad25"} Jan 30 08:30:04 crc kubenswrapper[4870]: I0130 08:30:04.234148 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a526ed944565bd757b9a4bba685fb1dd9feaa4eda6024207094fe861d1f2ad25" Jan 30 08:30:04 crc kubenswrapper[4870]: I0130 08:30:04.234299 4870 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c" Jan 30 08:30:04 crc kubenswrapper[4870]: I0130 08:30:04.234737 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 08:30:04 crc kubenswrapper[4870]: I0130 08:30:04.234800 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 08:30:05 crc kubenswrapper[4870]: I0130 08:30:05.950972 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 08:30:05 crc kubenswrapper[4870]: I0130 08:30:05.960346 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.384799 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gs8vz"] Jan 30 08:30:07 crc kubenswrapper[4870]: E0130 08:30:07.385388 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82e1e2b-e78e-4b8f-8303-2ea82b24bf28" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385400 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82e1e2b-e78e-4b8f-8303-2ea82b24bf28" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: E0130 08:30:07.385421 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd82862-2bef-4d86-be4e-38f670a252bd" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385427 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd82862-2bef-4d86-be4e-38f670a252bd" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: E0130 08:30:07.385442 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3ecb0f40-780e-4f90-84aa-17af92178d88" containerName="collect-profiles" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385448 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecb0f40-780e-4f90-84aa-17af92178d88" containerName="collect-profiles" Jan 30 08:30:07 crc kubenswrapper[4870]: E0130 08:30:07.385457 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa80552-6dc1-43b4-ba32-8fca58595c32" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385463 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa80552-6dc1-43b4-ba32-8fca58595c32" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: E0130 08:30:07.385477 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385483 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: E0130 08:30:07.385504 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0467c513-d47e-4251-a042-74a1f0a3ba8e" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385510 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0467c513-d47e-4251-a042-74a1f0a3ba8e" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: E0130 08:30:07.385521 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf298cb-af81-4272-aacd-2d1342eab106" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385527 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf298cb-af81-4272-aacd-2d1342eab106" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc 
kubenswrapper[4870]: I0130 08:30:07.385680 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82e1e2b-e78e-4b8f-8303-2ea82b24bf28" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385693 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0467c513-d47e-4251-a042-74a1f0a3ba8e" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385706 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf298cb-af81-4272-aacd-2d1342eab106" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385716 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385728 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa80552-6dc1-43b4-ba32-8fca58595c32" containerName="mariadb-account-create-update" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385740 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ecb0f40-780e-4f90-84aa-17af92178d88" containerName="collect-profiles" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.385749 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd82862-2bef-4d86-be4e-38f670a252bd" containerName="mariadb-database-create" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.386487 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.388652 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4t5s8" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.388816 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.388961 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.398080 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gs8vz"] Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.513666 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.513740 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-scripts\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.513799 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-config-data\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " 
pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.513901 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvq4\" (UniqueName: \"kubernetes.io/projected/463149ce-687b-479c-ab61-030371f69acb-kube-api-access-ddvq4\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.615792 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.616209 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-scripts\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.616266 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-config-data\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.616313 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvq4\" (UniqueName: \"kubernetes.io/projected/463149ce-687b-479c-ab61-030371f69acb-kube-api-access-ddvq4\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: 
\"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.626727 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-config-data\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.633203 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-scripts\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.633742 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.635732 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvq4\" (UniqueName: \"kubernetes.io/projected/463149ce-687b-479c-ab61-030371f69acb-kube-api-access-ddvq4\") pod \"nova-cell0-conductor-db-sync-gs8vz\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:07 crc kubenswrapper[4870]: I0130 08:30:07.704675 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:08 crc kubenswrapper[4870]: I0130 08:30:08.218103 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 08:30:08 crc kubenswrapper[4870]: I0130 08:30:08.221814 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gs8vz"] Jan 30 08:30:08 crc kubenswrapper[4870]: I0130 08:30:08.253474 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Jan 30 08:30:08 crc kubenswrapper[4870]: I0130 08:30:08.273866 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" event={"ID":"463149ce-687b-479c-ab61-030371f69acb","Type":"ContainerStarted","Data":"6689eaccda3509e40128b331bffb4a6b1096fb417f907160a5fb9c550c8df122"} Jan 30 08:30:08 crc kubenswrapper[4870]: I0130 08:30:08.274143 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 30 08:30:08 crc kubenswrapper[4870]: I0130 08:30:08.314700 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Jan 30 08:30:08 crc kubenswrapper[4870]: I0130 08:30:08.372175 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 08:30:10 crc kubenswrapper[4870]: I0130 08:30:10.303912 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine" containerID="cri-o://f199bc9354b47787d97254410fafef753eb2ceb40e32f6f8bddff5ad8283b2f1" gracePeriod=30 Jan 30 08:30:11 crc kubenswrapper[4870]: I0130 08:30:11.320826 4870 generic.go:334] "Generic (PLEG): container finished" podID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" 
containerID="da041b1ddf9f3cc720ab7d45cc97fcefcbe1006c471d1d13b08c5f348d603138" exitCode=0 Jan 30 08:30:11 crc kubenswrapper[4870]: I0130 08:30:11.321179 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerDied","Data":"da041b1ddf9f3cc720ab7d45cc97fcefcbe1006c471d1d13b08c5f348d603138"} Jan 30 08:30:12 crc kubenswrapper[4870]: I0130 08:30:12.334467 4870 generic.go:334] "Generic (PLEG): container finished" podID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerID="f199bc9354b47787d97254410fafef753eb2ceb40e32f6f8bddff5ad8283b2f1" exitCode=0 Jan 30 08:30:12 crc kubenswrapper[4870]: I0130 08:30:12.334508 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerDied","Data":"f199bc9354b47787d97254410fafef753eb2ceb40e32f6f8bddff5ad8283b2f1"} Jan 30 08:30:12 crc kubenswrapper[4870]: I0130 08:30:12.334539 4870 scope.go:117] "RemoveContainer" containerID="bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba" Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.060447 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.069418 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba\": container with ID starting with bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba not found: ID does not exist" containerID="bbcef241c5e081af384f050ba2bd88c0480b531f6aa26fa9ee5db11c38f751ba" Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.069487 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.098940 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8628af25-d5e4-46a0-adec-4c25ca39676b-logs\") pod \"8628af25-d5e4-46a0-adec-4c25ca39676b\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099050 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-log-httpd\") pod \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099114 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-custom-prometheus-ca\") pod \"8628af25-d5e4-46a0-adec-4c25ca39676b\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") " Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099152 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-config-data\") pod \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099168 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-combined-ca-bundle\") pod \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") " Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099282 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-run-httpd\") pod \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") "
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099305 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-config-data\") pod \"8628af25-d5e4-46a0-adec-4c25ca39676b\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") "
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099325 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvh7f\" (UniqueName: \"kubernetes.io/projected/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-kube-api-access-dvh7f\") pod \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") "
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099363 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-scripts\") pod \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") "
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.099391 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-sg-core-conf-yaml\") pod \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\" (UID: \"fa8c4e64-0886-44a9-95cb-6d6cc56748c1\") "
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.101379 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fa8c4e64-0886-44a9-95cb-6d6cc56748c1" (UID: "fa8c4e64-0886-44a9-95cb-6d6cc56748c1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.101632 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fa8c4e64-0886-44a9-95cb-6d6cc56748c1" (UID: "fa8c4e64-0886-44a9-95cb-6d6cc56748c1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.101858 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8628af25-d5e4-46a0-adec-4c25ca39676b-logs" (OuterVolumeSpecName: "logs") pod "8628af25-d5e4-46a0-adec-4c25ca39676b" (UID: "8628af25-d5e4-46a0-adec-4c25ca39676b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.109226 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-kube-api-access-dvh7f" (OuterVolumeSpecName: "kube-api-access-dvh7f") pod "fa8c4e64-0886-44a9-95cb-6d6cc56748c1" (UID: "fa8c4e64-0886-44a9-95cb-6d6cc56748c1"). InnerVolumeSpecName "kube-api-access-dvh7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.128257 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-scripts" (OuterVolumeSpecName: "scripts") pod "fa8c4e64-0886-44a9-95cb-6d6cc56748c1" (UID: "fa8c4e64-0886-44a9-95cb-6d6cc56748c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.142784 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8628af25-d5e4-46a0-adec-4c25ca39676b" (UID: "8628af25-d5e4-46a0-adec-4c25ca39676b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.152633 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fa8c4e64-0886-44a9-95cb-6d6cc56748c1" (UID: "fa8c4e64-0886-44a9-95cb-6d6cc56748c1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.170821 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-config-data" (OuterVolumeSpecName: "config-data") pod "8628af25-d5e4-46a0-adec-4c25ca39676b" (UID: "8628af25-d5e4-46a0-adec-4c25ca39676b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.201002 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9h65\" (UniqueName: \"kubernetes.io/projected/8628af25-d5e4-46a0-adec-4c25ca39676b-kube-api-access-l9h65\") pod \"8628af25-d5e4-46a0-adec-4c25ca39676b\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") "
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.201637 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-combined-ca-bundle\") pod \"8628af25-d5e4-46a0-adec-4c25ca39676b\" (UID: \"8628af25-d5e4-46a0-adec-4c25ca39676b\") "
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202461 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8628af25-d5e4-46a0-adec-4c25ca39676b-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202491 4870 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202515 4870 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202603 4870 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202614 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202624 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvh7f\" (UniqueName: \"kubernetes.io/projected/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-kube-api-access-dvh7f\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202632 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.202675 4870 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.206821 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8628af25-d5e4-46a0-adec-4c25ca39676b-kube-api-access-l9h65" (OuterVolumeSpecName: "kube-api-access-l9h65") pod "8628af25-d5e4-46a0-adec-4c25ca39676b" (UID: "8628af25-d5e4-46a0-adec-4c25ca39676b"). InnerVolumeSpecName "kube-api-access-l9h65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.213173 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-config-data" (OuterVolumeSpecName: "config-data") pod "fa8c4e64-0886-44a9-95cb-6d6cc56748c1" (UID: "fa8c4e64-0886-44a9-95cb-6d6cc56748c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.222114 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa8c4e64-0886-44a9-95cb-6d6cc56748c1" (UID: "fa8c4e64-0886-44a9-95cb-6d6cc56748c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.237075 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8628af25-d5e4-46a0-adec-4c25ca39676b" (UID: "8628af25-d5e4-46a0-adec-4c25ca39676b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.304777 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.304809 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c4e64-0886-44a9-95cb-6d6cc56748c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.304821 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8628af25-d5e4-46a0-adec-4c25ca39676b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.304831 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9h65\" (UniqueName: \"kubernetes.io/projected/8628af25-d5e4-46a0-adec-4c25ca39676b-kube-api-access-l9h65\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.373785 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8628af25-d5e4-46a0-adec-4c25ca39676b","Type":"ContainerDied","Data":"ac51c1c09a44741e0ef19b1f29a0de83b725ade9e973b9a5af1ac06cc507c0dc"}
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.373795 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.373844 4870 scope.go:117] "RemoveContainer" containerID="f199bc9354b47787d97254410fafef753eb2ceb40e32f6f8bddff5ad8283b2f1"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.378597 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa8c4e64-0886-44a9-95cb-6d6cc56748c1","Type":"ContainerDied","Data":"2b005b2e14aaf89586d8bc7aa8c8d809f04257ae6b9ce25164b29b34269d45d4"}
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.378706 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.382076 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" event={"ID":"463149ce-687b-479c-ab61-030371f69acb","Type":"ContainerStarted","Data":"ebc3f13bad52a8c63665a782767852e60e31712851103659b19d6a855c623701"}
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.407743 4870 scope.go:117] "RemoveContainer" containerID="1bc5247fa1f481f03b4623f335ffd2271a8804ce7729b91b95a52302230b32d9"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.410211 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" podStartSLOduration=1.5689813799999999 podStartE2EDuration="9.410199593s" podCreationTimestamp="2026-01-30 08:30:07 +0000 UTC" firstStartedPulling="2026-01-30 08:30:08.234963205 +0000 UTC m=+1246.930510324" lastFinishedPulling="2026-01-30 08:30:16.076181418 +0000 UTC m=+1254.771728537" observedRunningTime="2026-01-30 08:30:16.39580135 +0000 UTC m=+1255.091348449" watchObservedRunningTime="2026-01-30 08:30:16.410199593 +0000 UTC m=+1255.105746702"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.434123 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.447673 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.460212 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.464641 4870 scope.go:117] "RemoveContainer" containerID="734eba5ad7a5ddcb654b171f45ad07bc974840fb2dfafb87d21bbd4db9b452f9"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.488264 4870 scope.go:117] "RemoveContainer" containerID="2d34ee72279320556589f3ce491e8f0d6dfe6c0bb318d8e32fba2f3e68f571e6"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.498250 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.506614 4870 scope.go:117] "RemoveContainer" containerID="da041b1ddf9f3cc720ab7d45cc97fcefcbe1006c471d1d13b08c5f348d603138"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516464 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516821 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="proxy-httpd"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516836 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="proxy-httpd"
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516850 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516856 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516865 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-central-agent"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516870 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-central-agent"
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516896 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516903 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516918 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-notification-agent"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516923 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-notification-agent"
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516937 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516943 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516954 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516960 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: E0130 08:30:16.516971 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="sg-core"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.516976 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="sg-core"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517137 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-notification-agent"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517150 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517157 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="sg-core"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517194 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="proxy-httpd"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517206 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517218 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517226 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" containerName="watcher-decision-engine"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517240 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" containerName="ceilometer-central-agent"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.517797 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.520616 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.540723 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.543455 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.547219 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.547598 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.556144 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.568946 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.613254 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.613318 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-config-data\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.613365 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.613423 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.613458 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-run-httpd\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.613682 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.613925 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-log-httpd\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.614006 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v65fb\" (UniqueName: \"kubernetes.io/projected/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-kube-api-access-v65fb\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.614056 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjx8w\" (UniqueName: \"kubernetes.io/projected/83b9fe73-9106-4f9b-9272-6f12e3fb8177-kube-api-access-pjx8w\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.614273 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-scripts\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.614392 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-config-data\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.614519 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b9fe73-9106-4f9b-9272-6f12e3fb8177-logs\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717005 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b9fe73-9106-4f9b-9272-6f12e3fb8177-logs\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717117 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717161 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-config-data\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717218 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717299 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717340 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-run-httpd\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717434 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.717606 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-log-httpd\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.718581 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-run-httpd\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.718811 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b9fe73-9106-4f9b-9272-6f12e3fb8177-logs\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.718953 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-log-httpd\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.719165 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v65fb\" (UniqueName: \"kubernetes.io/projected/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-kube-api-access-v65fb\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.719222 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjx8w\" (UniqueName: \"kubernetes.io/projected/83b9fe73-9106-4f9b-9272-6f12e3fb8177-kube-api-access-pjx8w\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.719314 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-scripts\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.719370 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-config-data\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.724261 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.724312 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.725588 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-scripts\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.725784 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-config-data\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.736983 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-config-data\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.739695 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/83b9fe73-9106-4f9b-9272-6f12e3fb8177-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.740031 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.744408 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjx8w\" (UniqueName: \"kubernetes.io/projected/83b9fe73-9106-4f9b-9272-6f12e3fb8177-kube-api-access-pjx8w\") pod \"watcher-decision-engine-0\" (UID: \"83b9fe73-9106-4f9b-9272-6f12e3fb8177\") " pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.749411 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v65fb\" (UniqueName: \"kubernetes.io/projected/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-kube-api-access-v65fb\") pod \"ceilometer-0\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " pod="openstack/ceilometer-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.839332 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:16 crc kubenswrapper[4870]: I0130 08:30:16.861228 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 08:30:17 crc kubenswrapper[4870]: I0130 08:30:17.331129 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Jan 30 08:30:17 crc kubenswrapper[4870]: W0130 08:30:17.333552 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83b9fe73_9106_4f9b_9272_6f12e3fb8177.slice/crio-59cd94fd7fcdb82830f502c0f40b770522ca121550e6d1d09e54edebbcbae41d WatchSource:0}: Error finding container 59cd94fd7fcdb82830f502c0f40b770522ca121550e6d1d09e54edebbcbae41d: Status 404 returned error can't find the container with id 59cd94fd7fcdb82830f502c0f40b770522ca121550e6d1d09e54edebbcbae41d
Jan 30 08:30:17 crc kubenswrapper[4870]: I0130 08:30:17.401739 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"83b9fe73-9106-4f9b-9272-6f12e3fb8177","Type":"ContainerStarted","Data":"59cd94fd7fcdb82830f502c0f40b770522ca121550e6d1d09e54edebbcbae41d"}
Jan 30 08:30:17 crc kubenswrapper[4870]: I0130 08:30:17.423840 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:30:17 crc kubenswrapper[4870]: W0130 08:30:17.433395 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ccaa05d_1e2c_454e_9fa9_80fe3c2397b4.slice/crio-7be4b49268eb8a069cb4d0016e369267f7bb7570d3aba71266ecdbbd2380fa42 WatchSource:0}: Error finding container 7be4b49268eb8a069cb4d0016e369267f7bb7570d3aba71266ecdbbd2380fa42: Status 404 returned error can't find the container with id 7be4b49268eb8a069cb4d0016e369267f7bb7570d3aba71266ecdbbd2380fa42
Jan 30 08:30:18 crc kubenswrapper[4870]: I0130 08:30:18.087535 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8628af25-d5e4-46a0-adec-4c25ca39676b" path="/var/lib/kubelet/pods/8628af25-d5e4-46a0-adec-4c25ca39676b/volumes"
Jan 30 08:30:18 crc kubenswrapper[4870]: I0130 08:30:18.088703 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8c4e64-0886-44a9-95cb-6d6cc56748c1" path="/var/lib/kubelet/pods/fa8c4e64-0886-44a9-95cb-6d6cc56748c1/volumes"
Jan 30 08:30:18 crc kubenswrapper[4870]: I0130 08:30:18.424622 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"83b9fe73-9106-4f9b-9272-6f12e3fb8177","Type":"ContainerStarted","Data":"eb647bce7c3be1548b65527b9d43a9c627a697f978ae8303fb810920844ec093"}
Jan 30 08:30:18 crc kubenswrapper[4870]: I0130 08:30:18.428299 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerStarted","Data":"38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2"}
Jan 30 08:30:18 crc kubenswrapper[4870]: I0130 08:30:18.428371 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerStarted","Data":"1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7"}
Jan 30 08:30:18 crc kubenswrapper[4870]: I0130 08:30:18.428385 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerStarted","Data":"7be4b49268eb8a069cb4d0016e369267f7bb7570d3aba71266ecdbbd2380fa42"}
Jan 30 08:30:18 crc kubenswrapper[4870]: I0130 08:30:18.448485 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.448464033 podStartE2EDuration="2.448464033s" podCreationTimestamp="2026-01-30 08:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:30:18.448213315 +0000 UTC m=+1257.143760424" watchObservedRunningTime="2026-01-30 08:30:18.448464033 +0000 UTC m=+1257.144011142"
Jan 30 08:30:19 crc kubenswrapper[4870]: I0130 08:30:19.449414 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerStarted","Data":"50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea"}
Jan 30 08:30:21 crc kubenswrapper[4870]: I0130 08:30:21.470224 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerStarted","Data":"e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261"}
Jan 30 08:30:21 crc kubenswrapper[4870]: I0130 08:30:21.470619 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 30 08:30:21 crc kubenswrapper[4870]: I0130 08:30:21.503867 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.446582854 podStartE2EDuration="5.50384692s" podCreationTimestamp="2026-01-30 08:30:16 +0000 UTC" firstStartedPulling="2026-01-30 08:30:17.436202689 +0000 UTC m=+1256.131749798" lastFinishedPulling="2026-01-30 08:30:20.493466755 +0000 UTC m=+1259.189013864" observedRunningTime="2026-01-30 08:30:21.500755313 +0000 UTC m=+1260.196302422" watchObservedRunningTime="2026-01-30 08:30:21.50384692 +0000 UTC m=+1260.199394039"
Jan 30 08:30:26 crc kubenswrapper[4870]: I0130 08:30:26.840610 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:26 crc kubenswrapper[4870]: I0130 08:30:26.873839 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.141774 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.142121 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-central-agent" containerID="cri-o://1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7" gracePeriod=30
Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.142165 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="proxy-httpd" containerID="cri-o://e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261" gracePeriod=30
Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.142215 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-notification-agent" containerID="cri-o://38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2" gracePeriod=30
Jan 30 08:30:27 crc
kubenswrapper[4870]: I0130 08:30:27.142259 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="sg-core" containerID="cri-o://50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea" gracePeriod=30 Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.533610 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerID="e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261" exitCode=0 Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.533995 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerID="50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea" exitCode=2 Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.533708 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerDied","Data":"e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261"} Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.534104 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerDied","Data":"50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea"} Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.534317 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 30 08:30:27 crc kubenswrapper[4870]: I0130 08:30:27.568599 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Jan 30 08:30:28 crc kubenswrapper[4870]: I0130 08:30:28.549354 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" 
containerID="1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7" exitCode=0 Jan 30 08:30:28 crc kubenswrapper[4870]: I0130 08:30:28.549436 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerDied","Data":"1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7"} Jan 30 08:30:29 crc kubenswrapper[4870]: I0130 08:30:29.957418 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.090366 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-config-data\") pod \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.090405 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-combined-ca-bundle\") pod \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.090452 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-run-httpd\") pod \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.090492 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-scripts\") pod \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 
08:30:30.090539 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-sg-core-conf-yaml\") pod \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.090575 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-log-httpd\") pod \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.090608 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v65fb\" (UniqueName: \"kubernetes.io/projected/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-kube-api-access-v65fb\") pod \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\" (UID: \"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4\") " Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.091310 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" (UID: "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.091779 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" (UID: "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.096031 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-scripts" (OuterVolumeSpecName: "scripts") pod "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" (UID: "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.098073 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-kube-api-access-v65fb" (OuterVolumeSpecName: "kube-api-access-v65fb") pod "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" (UID: "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4"). InnerVolumeSpecName "kube-api-access-v65fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.127115 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" (UID: "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.179268 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" (UID: "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.193013 4870 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.193058 4870 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.193070 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v65fb\" (UniqueName: \"kubernetes.io/projected/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-kube-api-access-v65fb\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.193085 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.193096 4870 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.193104 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.197598 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-config-data" (OuterVolumeSpecName: "config-data") pod "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" (UID: "2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.295256 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.587402 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerID="38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2" exitCode=0 Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.587469 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerDied","Data":"38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2"} Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.587509 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4","Type":"ContainerDied","Data":"7be4b49268eb8a069cb4d0016e369267f7bb7570d3aba71266ecdbbd2380fa42"} Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.587538 4870 scope.go:117] "RemoveContainer" containerID="e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.587720 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.668748 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.674633 4870 scope.go:117] "RemoveContainer" containerID="50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.695523 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.707773 4870 scope.go:117] "RemoveContainer" containerID="38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.732028 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.732579 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-notification-agent" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.732982 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-notification-agent" Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.732998 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="proxy-httpd" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.733005 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="proxy-httpd" Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.733030 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-central-agent" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.733037 4870 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-central-agent" Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.733064 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="sg-core" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.733620 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="sg-core" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.733846 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-notification-agent" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.733868 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="sg-core" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.733903 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="ceilometer-central-agent" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.733917 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" containerName="proxy-httpd" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.736632 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.739587 4870 scope.go:117] "RemoveContainer" containerID="1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.739862 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.740139 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.743415 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.775933 4870 scope.go:117] "RemoveContainer" containerID="e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261" Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.776958 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261\": container with ID starting with e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261 not found: ID does not exist" containerID="e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.777016 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261"} err="failed to get container status \"e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261\": rpc error: code = NotFound desc = could not find container \"e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261\": container with ID starting with e4c1e39919fd158510ed9ffc817f40cc08f586836a067977cb44e8b211a54261 not found: ID does not exist" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 
08:30:30.777044 4870 scope.go:117] "RemoveContainer" containerID="50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea" Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.777545 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea\": container with ID starting with 50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea not found: ID does not exist" containerID="50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.777575 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea"} err="failed to get container status \"50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea\": rpc error: code = NotFound desc = could not find container \"50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea\": container with ID starting with 50870a595f976100abdb26b9148f62d048280ed33d045282681532815e480cea not found: ID does not exist" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.777589 4870 scope.go:117] "RemoveContainer" containerID="38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2" Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.777960 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2\": container with ID starting with 38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2 not found: ID does not exist" containerID="38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.778105 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2"} err="failed to get container status \"38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2\": rpc error: code = NotFound desc = could not find container \"38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2\": container with ID starting with 38570dc31256f2263922dd40e563dcdb7b675589f50624f3d2f68c1f793370d2 not found: ID does not exist" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.778277 4870 scope.go:117] "RemoveContainer" containerID="1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7" Jan 30 08:30:30 crc kubenswrapper[4870]: E0130 08:30:30.778748 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7\": container with ID starting with 1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7 not found: ID does not exist" containerID="1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.778769 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7"} err="failed to get container status \"1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7\": rpc error: code = NotFound desc = could not find container \"1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7\": container with ID starting with 1517f3c15fd718c9d9cfa8b01e312f1d0b907c9e264d9479f92a08326c0056b7 not found: ID does not exist" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.804415 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.804472 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.804525 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-run-httpd\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.804543 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-log-httpd\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.804739 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-config-data\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.804825 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-scripts\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.805000 
4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2wxb\" (UniqueName: \"kubernetes.io/projected/f11d5abc-9e24-41c5-9e26-22a939d70180-kube-api-access-l2wxb\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.907629 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.908183 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-run-httpd\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.908376 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-log-httpd\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.908607 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-config-data\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.908850 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-scripts\") pod \"ceilometer-0\" (UID: 
\"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.908863 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-log-httpd\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.908961 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-run-httpd\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.909303 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2wxb\" (UniqueName: \"kubernetes.io/projected/f11d5abc-9e24-41c5-9e26-22a939d70180-kube-api-access-l2wxb\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.909832 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.913819 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.917560 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-scripts\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.917753 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-config-data\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.925815 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:30 crc kubenswrapper[4870]: I0130 08:30:30.927188 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2wxb\" (UniqueName: \"kubernetes.io/projected/f11d5abc-9e24-41c5-9e26-22a939d70180-kube-api-access-l2wxb\") pod \"ceilometer-0\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " pod="openstack/ceilometer-0" Jan 30 08:30:31 crc kubenswrapper[4870]: I0130 08:30:31.069217 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:30:31 crc kubenswrapper[4870]: W0130 08:30:31.495858 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf11d5abc_9e24_41c5_9e26_22a939d70180.slice/crio-0c2a4eddab15cef0cbeaace28cc33784a69c48cca42cc8364520ed9bf9b84959 WatchSource:0}: Error finding container 0c2a4eddab15cef0cbeaace28cc33784a69c48cca42cc8364520ed9bf9b84959: Status 404 returned error can't find the container with id 0c2a4eddab15cef0cbeaace28cc33784a69c48cca42cc8364520ed9bf9b84959 Jan 30 08:30:31 crc kubenswrapper[4870]: I0130 08:30:31.501757 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:30:31 crc kubenswrapper[4870]: I0130 08:30:31.598873 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerStarted","Data":"0c2a4eddab15cef0cbeaace28cc33784a69c48cca42cc8364520ed9bf9b84959"} Jan 30 08:30:32 crc kubenswrapper[4870]: I0130 08:30:32.099310 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4" path="/var/lib/kubelet/pods/2ccaa05d-1e2c-454e-9fa9-80fe3c2397b4/volumes" Jan 30 08:30:32 crc kubenswrapper[4870]: I0130 08:30:32.615847 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerStarted","Data":"b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd"} Jan 30 08:30:32 crc kubenswrapper[4870]: I0130 08:30:32.615921 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerStarted","Data":"0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb"} Jan 30 08:30:32 crc kubenswrapper[4870]: I0130 08:30:32.618378 4870 generic.go:334] "Generic (PLEG): 
container finished" podID="463149ce-687b-479c-ab61-030371f69acb" containerID="ebc3f13bad52a8c63665a782767852e60e31712851103659b19d6a855c623701" exitCode=0 Jan 30 08:30:32 crc kubenswrapper[4870]: I0130 08:30:32.618426 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" event={"ID":"463149ce-687b-479c-ab61-030371f69acb","Type":"ContainerDied","Data":"ebc3f13bad52a8c63665a782767852e60e31712851103659b19d6a855c623701"} Jan 30 08:30:33 crc kubenswrapper[4870]: I0130 08:30:33.648471 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerStarted","Data":"e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3"} Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.169498 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.292850 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-combined-ca-bundle\") pod \"463149ce-687b-479c-ab61-030371f69acb\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.293353 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-config-data\") pod \"463149ce-687b-479c-ab61-030371f69acb\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.293461 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddvq4\" (UniqueName: \"kubernetes.io/projected/463149ce-687b-479c-ab61-030371f69acb-kube-api-access-ddvq4\") pod \"463149ce-687b-479c-ab61-030371f69acb\" (UID: 
\"463149ce-687b-479c-ab61-030371f69acb\") " Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.293621 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-scripts\") pod \"463149ce-687b-479c-ab61-030371f69acb\" (UID: \"463149ce-687b-479c-ab61-030371f69acb\") " Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.298665 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463149ce-687b-479c-ab61-030371f69acb-kube-api-access-ddvq4" (OuterVolumeSpecName: "kube-api-access-ddvq4") pod "463149ce-687b-479c-ab61-030371f69acb" (UID: "463149ce-687b-479c-ab61-030371f69acb"). InnerVolumeSpecName "kube-api-access-ddvq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.299116 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-scripts" (OuterVolumeSpecName: "scripts") pod "463149ce-687b-479c-ab61-030371f69acb" (UID: "463149ce-687b-479c-ab61-030371f69acb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.303357 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddvq4\" (UniqueName: \"kubernetes.io/projected/463149ce-687b-479c-ab61-030371f69acb-kube-api-access-ddvq4\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.303419 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.320232 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-config-data" (OuterVolumeSpecName: "config-data") pod "463149ce-687b-479c-ab61-030371f69acb" (UID: "463149ce-687b-479c-ab61-030371f69acb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.326587 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "463149ce-687b-479c-ab61-030371f69acb" (UID: "463149ce-687b-479c-ab61-030371f69acb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.404144 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.404174 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463149ce-687b-479c-ab61-030371f69acb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.660808 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" event={"ID":"463149ce-687b-479c-ab61-030371f69acb","Type":"ContainerDied","Data":"6689eaccda3509e40128b331bffb4a6b1096fb417f907160a5fb9c550c8df122"} Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.660856 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6689eaccda3509e40128b331bffb4a6b1096fb417f907160a5fb9c550c8df122" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.660957 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gs8vz" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.761664 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 08:30:34 crc kubenswrapper[4870]: E0130 08:30:34.762050 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463149ce-687b-479c-ab61-030371f69acb" containerName="nova-cell0-conductor-db-sync" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.762067 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="463149ce-687b-479c-ab61-030371f69acb" containerName="nova-cell0-conductor-db-sync" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.762218 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="463149ce-687b-479c-ab61-030371f69acb" containerName="nova-cell0-conductor-db-sync" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.762810 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.765642 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.765938 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4t5s8" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.782011 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.809907 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9834ddd4-269a-463c-953c-1bf07a7ffdf0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 
08:30:34.809997 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9834ddd4-269a-463c-953c-1bf07a7ffdf0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.810097 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvfr\" (UniqueName: \"kubernetes.io/projected/9834ddd4-269a-463c-953c-1bf07a7ffdf0-kube-api-access-8jvfr\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.915069 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9834ddd4-269a-463c-953c-1bf07a7ffdf0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.915171 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9834ddd4-269a-463c-953c-1bf07a7ffdf0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.915260 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvfr\" (UniqueName: \"kubernetes.io/projected/9834ddd4-269a-463c-953c-1bf07a7ffdf0-kube-api-access-8jvfr\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.919075 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9834ddd4-269a-463c-953c-1bf07a7ffdf0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.919509 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9834ddd4-269a-463c-953c-1bf07a7ffdf0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:34 crc kubenswrapper[4870]: I0130 08:30:34.931751 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvfr\" (UniqueName: \"kubernetes.io/projected/9834ddd4-269a-463c-953c-1bf07a7ffdf0-kube-api-access-8jvfr\") pod \"nova-cell0-conductor-0\" (UID: \"9834ddd4-269a-463c-953c-1bf07a7ffdf0\") " pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:35 crc kubenswrapper[4870]: I0130 08:30:35.079050 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:35 crc kubenswrapper[4870]: I0130 08:30:35.594412 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 08:30:35 crc kubenswrapper[4870]: I0130 08:30:35.691015 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9834ddd4-269a-463c-953c-1bf07a7ffdf0","Type":"ContainerStarted","Data":"ba7277c7f1d864e9e8adb0b704d5d7dd97c1180da3e942a33f62c02bd5288731"} Jan 30 08:30:35 crc kubenswrapper[4870]: I0130 08:30:35.694693 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerStarted","Data":"722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394"} Jan 30 08:30:35 crc kubenswrapper[4870]: I0130 08:30:35.694821 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 08:30:35 crc kubenswrapper[4870]: I0130 08:30:35.732137 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.648352429 podStartE2EDuration="5.732111287s" podCreationTimestamp="2026-01-30 08:30:30 +0000 UTC" firstStartedPulling="2026-01-30 08:30:31.500114093 +0000 UTC m=+1270.195661232" lastFinishedPulling="2026-01-30 08:30:34.583872981 +0000 UTC m=+1273.279420090" observedRunningTime="2026-01-30 08:30:35.715072622 +0000 UTC m=+1274.410619751" watchObservedRunningTime="2026-01-30 08:30:35.732111287 +0000 UTC m=+1274.427658406" Jan 30 08:30:36 crc kubenswrapper[4870]: I0130 08:30:36.711762 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9834ddd4-269a-463c-953c-1bf07a7ffdf0","Type":"ContainerStarted","Data":"23b02075ad542fb7c6d85eae2e1e1a8e5e25c2362ace4d98e7c91a23f271e2da"} Jan 30 08:30:36 crc kubenswrapper[4870]: I0130 08:30:36.761107 4870 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.7610752659999998 podStartE2EDuration="2.761075266s" podCreationTimestamp="2026-01-30 08:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:30:36.739504619 +0000 UTC m=+1275.435051728" watchObservedRunningTime="2026-01-30 08:30:36.761075266 +0000 UTC m=+1275.456622415" Jan 30 08:30:37 crc kubenswrapper[4870]: I0130 08:30:37.720650 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.127515 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.756042 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vrk8x"] Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.758772 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.766350 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.766914 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.775785 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrk8x"] Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.835705 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-config-data\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.836147 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hkv\" (UniqueName: \"kubernetes.io/projected/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-kube-api-access-m2hkv\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.836279 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.836453 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-scripts\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.940132 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-config-data\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.943585 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hkv\" (UniqueName: \"kubernetes.io/projected/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-kube-api-access-m2hkv\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.943779 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.944027 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-scripts\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.959724 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.962440 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hkv\" (UniqueName: \"kubernetes.io/projected/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-kube-api-access-m2hkv\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.966663 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-scripts\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.977725 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-config-data\") pod \"nova-cell0-cell-mapping-vrk8x\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.985159 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.986427 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.990082 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 08:30:40 crc kubenswrapper[4870]: I0130 08:30:40.996942 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.045499 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.045620 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85spt\" (UniqueName: \"kubernetes.io/projected/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-kube-api-access-85spt\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.045689 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-config-data\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.130199 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.143490 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.144774 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.146637 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.152808 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85spt\" (UniqueName: \"kubernetes.io/projected/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-kube-api-access-85spt\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.152913 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-config-data\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.153075 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.162667 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-config-data\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.178379 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.184371 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.200331 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85spt\" (UniqueName: \"kubernetes.io/projected/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-kube-api-access-85spt\") pod \"nova-scheduler-0\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.235451 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.238542 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.245333 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.247014 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.255977 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.256646 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.257707 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.257746 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4130fea-be36-47f2-9940-fd3bddcbe3c5-logs\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.257834 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.257850 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czf2b\" (UniqueName: \"kubernetes.io/projected/a4130fea-be36-47f2-9940-fd3bddcbe3c5-kube-api-access-czf2b\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.257926 
4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdt9v\" (UniqueName: \"kubernetes.io/projected/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-kube-api-access-sdt9v\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.257943 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-config-data\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.257963 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.272677 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.306028 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.348673 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7777964479-kzgv2"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.350960 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359154 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdt9v\" (UniqueName: \"kubernetes.io/projected/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-kube-api-access-sdt9v\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359185 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-config-data\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359210 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359272 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35497556-3464-49d4-9dc2-8f8153a1db82-logs\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359292 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359317 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4130fea-be36-47f2-9940-fd3bddcbe3c5-logs\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359332 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359374 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qwkz\" (UniqueName: \"kubernetes.io/projected/35497556-3464-49d4-9dc2-8f8153a1db82-kube-api-access-9qwkz\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359407 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-config-data\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359445 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.359459 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czf2b\" (UniqueName: 
\"kubernetes.io/projected/a4130fea-be36-47f2-9940-fd3bddcbe3c5-kube-api-access-czf2b\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.363421 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4130fea-be36-47f2-9940-fd3bddcbe3c5-logs\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.366373 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.367437 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-config-data\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.370789 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.374077 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7777964479-kzgv2"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.379395 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czf2b\" (UniqueName: \"kubernetes.io/projected/a4130fea-be36-47f2-9940-fd3bddcbe3c5-kube-api-access-czf2b\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 
08:30:41.381070 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.385337 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.402192 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdt9v\" (UniqueName: \"kubernetes.io/projected/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-kube-api-access-sdt9v\") pod \"nova-cell1-novncproxy-0\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465103 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-swift-storage-0\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465176 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qwkz\" (UniqueName: \"kubernetes.io/projected/35497556-3464-49d4-9dc2-8f8153a1db82-kube-api-access-9qwkz\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465214 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-config-data\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465259 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-sb\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465310 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-nb\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465332 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br4sd\" (UniqueName: \"kubernetes.io/projected/d5925267-e75f-4398-af96-6856710c57f3-kube-api-access-br4sd\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465353 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-svc\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465375 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-config\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465409 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35497556-3464-49d4-9dc2-8f8153a1db82-logs\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.465440 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.466234 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35497556-3464-49d4-9dc2-8f8153a1db82-logs\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.471437 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-config-data\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.471694 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 
08:30:41.493910 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qwkz\" (UniqueName: \"kubernetes.io/projected/35497556-3464-49d4-9dc2-8f8153a1db82-kube-api-access-9qwkz\") pod \"nova-api-0\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") " pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.565949 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-config\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.566075 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-swift-storage-0\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.566201 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-sb\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.566266 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-nb\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.566302 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-br4sd\" (UniqueName: \"kubernetes.io/projected/d5925267-e75f-4398-af96-6856710c57f3-kube-api-access-br4sd\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.566327 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-svc\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.566973 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-swift-storage-0\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.567020 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-config\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.567683 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-svc\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.568170 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-nb\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.568229 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-sb\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.575639 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.585515 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br4sd\" (UniqueName: \"kubernetes.io/projected/d5925267-e75f-4398-af96-6856710c57f3-kube-api-access-br4sd\") pod \"dnsmasq-dns-7777964479-kzgv2\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.614647 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.632693 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.730730 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.802138 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrk8x"] Jan 30 08:30:41 crc kubenswrapper[4870]: W0130 08:30:41.828055 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e3f03b1_ce9f_4f1d_8bb9_eecb941268c5.slice/crio-cabf47e5c9ce25b5156d8706dcfa4a8797c2b4154837e6413d0ebe4fba915dd9 WatchSource:0}: Error finding container cabf47e5c9ce25b5156d8706dcfa4a8797c2b4154837e6413d0ebe4fba915dd9: Status 404 returned error can't find the container with id cabf47e5c9ce25b5156d8706dcfa4a8797c2b4154837e6413d0ebe4fba915dd9 Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.884330 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m882v"] Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.885515 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.887927 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.889035 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 08:30:41 crc kubenswrapper[4870]: I0130 08:30:41.901177 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m882v"] Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.024009 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2s2k\" (UniqueName: \"kubernetes.io/projected/df7d1e35-e72c-4a05-8a4a-89647f93a26c-kube-api-access-d2s2k\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.024395 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-scripts\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.024586 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-config-data\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.024835 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.039383 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.145290 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-config-data\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.145603 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.145640 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2s2k\" (UniqueName: \"kubernetes.io/projected/df7d1e35-e72c-4a05-8a4a-89647f93a26c-kube-api-access-d2s2k\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.145681 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-scripts\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " 
pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.169967 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-scripts\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: W0130 08:30:42.170097 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c987d2_eb6f_4ad7_a6b3_97181526dc24.slice/crio-4844fa8b96732628baf7088ac1c015a802eaab1095cb6ae769aae1a9f257db48 WatchSource:0}: Error finding container 4844fa8b96732628baf7088ac1c015a802eaab1095cb6ae769aae1a9f257db48: Status 404 returned error can't find the container with id 4844fa8b96732628baf7088ac1c015a802eaab1095cb6ae769aae1a9f257db48 Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.170223 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2s2k\" (UniqueName: \"kubernetes.io/projected/df7d1e35-e72c-4a05-8a4a-89647f93a26c-kube-api-access-d2s2k\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.172476 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.186464 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-config-data\") pod \"nova-cell1-conductor-db-sync-m882v\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") " pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.253816 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m882v" Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.460312 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.564447 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.581075 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.698073 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7777964479-kzgv2"] Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.789100 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7777964479-kzgv2" event={"ID":"d5925267-e75f-4398-af96-6856710c57f3","Type":"ContainerStarted","Data":"0d8401900436a6761c400a0be0a0bdd42a9aa8031b291c68c5469cef3ec4cd02"} Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.791440 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4130fea-be36-47f2-9940-fd3bddcbe3c5","Type":"ContainerStarted","Data":"2bab53308716b22e97e1a942ac95ceca0d270b7f4b42f3d5be8a0178321b83a8"} Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.794245 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrk8x" event={"ID":"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5","Type":"ContainerStarted","Data":"d505e04dc454937c02de4ea80fb1b30e9ec281deb651bdc207ab606295f95619"} Jan 30 08:30:42 crc 
kubenswrapper[4870]: I0130 08:30:42.794289 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrk8x" event={"ID":"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5","Type":"ContainerStarted","Data":"cabf47e5c9ce25b5156d8706dcfa4a8797c2b4154837e6413d0ebe4fba915dd9"} Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.795501 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21f6c18c-fcc7-4bd5-9a86-81dacd111e90","Type":"ContainerStarted","Data":"ad4cc5b31839637e7fda99e5e00f775d4b4a8266d3fec39e6cf700bdb0b11a6c"} Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.797095 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c987d2-eb6f-4ad7-a6b3-97181526dc24","Type":"ContainerStarted","Data":"4844fa8b96732628baf7088ac1c015a802eaab1095cb6ae769aae1a9f257db48"} Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.798405 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35497556-3464-49d4-9dc2-8f8153a1db82","Type":"ContainerStarted","Data":"4241ed3c0f6370f180f8f986d85580b542b235a9c663a81abf39c2245d59012a"} Jan 30 08:30:42 crc kubenswrapper[4870]: I0130 08:30:42.822888 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m882v"] Jan 30 08:30:42 crc kubenswrapper[4870]: W0130 08:30:42.831345 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf7d1e35_e72c_4a05_8a4a_89647f93a26c.slice/crio-1180a868984313de66495d96e16fbf7b9e7f4d89169ce27b87d1df78e6c1cbcf WatchSource:0}: Error finding container 1180a868984313de66495d96e16fbf7b9e7f4d89169ce27b87d1df78e6c1cbcf: Status 404 returned error can't find the container with id 1180a868984313de66495d96e16fbf7b9e7f4d89169ce27b87d1df78e6c1cbcf Jan 30 08:30:43 crc kubenswrapper[4870]: I0130 08:30:43.811361 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7777964479-kzgv2" event={"ID":"d5925267-e75f-4398-af96-6856710c57f3","Type":"ContainerStarted","Data":"895f5e0a2008516657010356d30e83d3b79850fdf910e5ede0c0b5280b3040c2"} Jan 30 08:30:43 crc kubenswrapper[4870]: I0130 08:30:43.819741 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m882v" event={"ID":"df7d1e35-e72c-4a05-8a4a-89647f93a26c","Type":"ContainerStarted","Data":"8042d48e5c92127670e964e37628c795eb4a833864e07f5be3a23644c40ab2aa"} Jan 30 08:30:43 crc kubenswrapper[4870]: I0130 08:30:43.819789 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m882v" event={"ID":"df7d1e35-e72c-4a05-8a4a-89647f93a26c","Type":"ContainerStarted","Data":"1180a868984313de66495d96e16fbf7b9e7f4d89169ce27b87d1df78e6c1cbcf"} Jan 30 08:30:43 crc kubenswrapper[4870]: I0130 08:30:43.859350 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vrk8x" podStartSLOduration=3.8593329450000002 podStartE2EDuration="3.859332945s" podCreationTimestamp="2026-01-30 08:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:30:43.856362082 +0000 UTC m=+1282.551909191" watchObservedRunningTime="2026-01-30 08:30:43.859332945 +0000 UTC m=+1282.554880054" Jan 30 08:30:43 crc kubenswrapper[4870]: I0130 08:30:43.879292 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-m882v" podStartSLOduration=2.879272252 podStartE2EDuration="2.879272252s" podCreationTimestamp="2026-01-30 08:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:30:43.870066574 +0000 UTC m=+1282.565613703" watchObservedRunningTime="2026-01-30 
08:30:43.879272252 +0000 UTC m=+1282.574819361" Jan 30 08:30:44 crc kubenswrapper[4870]: I0130 08:30:44.536500 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:30:44 crc kubenswrapper[4870]: I0130 08:30:44.571856 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 08:30:44 crc kubenswrapper[4870]: I0130 08:30:44.835615 4870 generic.go:334] "Generic (PLEG): container finished" podID="d5925267-e75f-4398-af96-6856710c57f3" containerID="895f5e0a2008516657010356d30e83d3b79850fdf910e5ede0c0b5280b3040c2" exitCode=0 Jan 30 08:30:44 crc kubenswrapper[4870]: I0130 08:30:44.836275 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7777964479-kzgv2" event={"ID":"d5925267-e75f-4398-af96-6856710c57f3","Type":"ContainerDied","Data":"895f5e0a2008516657010356d30e83d3b79850fdf910e5ede0c0b5280b3040c2"} Jan 30 08:30:46 crc kubenswrapper[4870]: I0130 08:30:46.866220 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7777964479-kzgv2" event={"ID":"d5925267-e75f-4398-af96-6856710c57f3","Type":"ContainerStarted","Data":"fa09eeaef0e8d067370ba4e9a769247437b75f3bbc783ef72b9b39a713b37db0"} Jan 30 08:30:47 crc kubenswrapper[4870]: I0130 08:30:47.881031 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:47 crc kubenswrapper[4870]: I0130 08:30:47.929526 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7777964479-kzgv2" podStartSLOduration=6.929496956 podStartE2EDuration="6.929496956s" podCreationTimestamp="2026-01-30 08:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:30:47.900271358 +0000 UTC m=+1286.595818497" watchObservedRunningTime="2026-01-30 08:30:47.929496956 +0000 UTC m=+1286.625044075" Jan 30 
08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.735648 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.814644 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d675956bc-zzkss"] Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.815365 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" podUID="3688605b-306e-4093-93d5-b96cae2a80de" containerName="dnsmasq-dns" containerID="cri-o://0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e" gracePeriod=10 Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.942036 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35497556-3464-49d4-9dc2-8f8153a1db82","Type":"ContainerStarted","Data":"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"} Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.944845 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4130fea-be36-47f2-9940-fd3bddcbe3c5","Type":"ContainerStarted","Data":"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11"} Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.946567 4870 generic.go:334] "Generic (PLEG): container finished" podID="8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" containerID="d505e04dc454937c02de4ea80fb1b30e9ec281deb651bdc207ab606295f95619" exitCode=0 Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.946644 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrk8x" event={"ID":"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5","Type":"ContainerDied","Data":"d505e04dc454937c02de4ea80fb1b30e9ec281deb651bdc207ab606295f95619"} Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.947809 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21f6c18c-fcc7-4bd5-9a86-81dacd111e90","Type":"ContainerStarted","Data":"cf022953959b6108a335c22e59a92909beccc351b6ace66848278caae812affb"} Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.948000 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="21f6c18c-fcc7-4bd5-9a86-81dacd111e90" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://cf022953959b6108a335c22e59a92909beccc351b6ace66848278caae812affb" gracePeriod=30 Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.959669 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c987d2-eb6f-4ad7-a6b3-97181526dc24","Type":"ContainerStarted","Data":"eeeda60427741389f29ce7682e68db91dead5b11853ceb5b520159a71514e942"} Jan 30 08:30:51 crc kubenswrapper[4870]: I0130 08:30:51.991979 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.125504899 podStartE2EDuration="11.991962834s" podCreationTimestamp="2026-01-30 08:30:40 +0000 UTC" firstStartedPulling="2026-01-30 08:30:42.172226489 +0000 UTC m=+1280.867773598" lastFinishedPulling="2026-01-30 08:30:51.038684394 +0000 UTC m=+1289.734231533" observedRunningTime="2026-01-30 08:30:51.982671293 +0000 UTC m=+1290.678218402" watchObservedRunningTime="2026-01-30 08:30:51.991962834 +0000 UTC m=+1290.687509933" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.007575 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.95936472 podStartE2EDuration="11.007556975s" podCreationTimestamp="2026-01-30 08:30:41 +0000 UTC" firstStartedPulling="2026-01-30 08:30:42.46189952 +0000 UTC m=+1281.157446629" lastFinishedPulling="2026-01-30 08:30:51.510091735 +0000 UTC m=+1290.205638884" observedRunningTime="2026-01-30 08:30:52.001085781 +0000 
UTC m=+1290.696632890" watchObservedRunningTime="2026-01-30 08:30:52.007556975 +0000 UTC m=+1290.703104084" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.451669 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.584987 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-nb\") pod \"3688605b-306e-4093-93d5-b96cae2a80de\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.585354 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-swift-storage-0\") pod \"3688605b-306e-4093-93d5-b96cae2a80de\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.585376 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xpwr\" (UniqueName: \"kubernetes.io/projected/3688605b-306e-4093-93d5-b96cae2a80de-kube-api-access-6xpwr\") pod \"3688605b-306e-4093-93d5-b96cae2a80de\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.585551 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-svc\") pod \"3688605b-306e-4093-93d5-b96cae2a80de\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.585590 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-sb\") pod 
\"3688605b-306e-4093-93d5-b96cae2a80de\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.585619 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-config\") pod \"3688605b-306e-4093-93d5-b96cae2a80de\" (UID: \"3688605b-306e-4093-93d5-b96cae2a80de\") " Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.598087 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3688605b-306e-4093-93d5-b96cae2a80de-kube-api-access-6xpwr" (OuterVolumeSpecName: "kube-api-access-6xpwr") pod "3688605b-306e-4093-93d5-b96cae2a80de" (UID: "3688605b-306e-4093-93d5-b96cae2a80de"). InnerVolumeSpecName "kube-api-access-6xpwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.642683 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3688605b-306e-4093-93d5-b96cae2a80de" (UID: "3688605b-306e-4093-93d5-b96cae2a80de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.655803 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-config" (OuterVolumeSpecName: "config") pod "3688605b-306e-4093-93d5-b96cae2a80de" (UID: "3688605b-306e-4093-93d5-b96cae2a80de"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.658463 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3688605b-306e-4093-93d5-b96cae2a80de" (UID: "3688605b-306e-4093-93d5-b96cae2a80de"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.662898 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3688605b-306e-4093-93d5-b96cae2a80de" (UID: "3688605b-306e-4093-93d5-b96cae2a80de"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.678155 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3688605b-306e-4093-93d5-b96cae2a80de" (UID: "3688605b-306e-4093-93d5-b96cae2a80de"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.687999 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.688106 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.688123 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.688133 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.688144 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xpwr\" (UniqueName: \"kubernetes.io/projected/3688605b-306e-4093-93d5-b96cae2a80de-kube-api-access-6xpwr\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.688154 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3688605b-306e-4093-93d5-b96cae2a80de-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.993704 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35497556-3464-49d4-9dc2-8f8153a1db82","Type":"ContainerStarted","Data":"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"} Jan 30 08:30:52 crc kubenswrapper[4870]: 
I0130 08:30:52.999431 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4130fea-be36-47f2-9940-fd3bddcbe3c5","Type":"ContainerStarted","Data":"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87"} Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.999615 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-log" containerID="cri-o://e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11" gracePeriod=30 Jan 30 08:30:52 crc kubenswrapper[4870]: I0130 08:30:52.999646 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-metadata" containerID="cri-o://690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87" gracePeriod=30 Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.003531 4870 generic.go:334] "Generic (PLEG): container finished" podID="3688605b-306e-4093-93d5-b96cae2a80de" containerID="0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e" exitCode=0 Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.003602 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" event={"ID":"3688605b-306e-4093-93d5-b96cae2a80de","Type":"ContainerDied","Data":"0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e"} Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.003645 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.003657 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d675956bc-zzkss" event={"ID":"3688605b-306e-4093-93d5-b96cae2a80de","Type":"ContainerDied","Data":"38880a59bd8e8ad87943840cbaa98251faf8d234264c0ca4ae49cc1e495e8ef5"} Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.003681 4870 scope.go:117] "RemoveContainer" containerID="0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.036864 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.55776425 podStartE2EDuration="12.032859009s" podCreationTimestamp="2026-01-30 08:30:41 +0000 UTC" firstStartedPulling="2026-01-30 08:30:42.574822698 +0000 UTC m=+1281.270369807" lastFinishedPulling="2026-01-30 08:30:51.049917457 +0000 UTC m=+1289.745464566" observedRunningTime="2026-01-30 08:30:53.030235777 +0000 UTC m=+1291.725782906" watchObservedRunningTime="2026-01-30 08:30:53.032859009 +0000 UTC m=+1291.728406138" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.049419 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.112773369 podStartE2EDuration="12.049396308s" podCreationTimestamp="2026-01-30 08:30:41 +0000 UTC" firstStartedPulling="2026-01-30 08:30:42.577692218 +0000 UTC m=+1281.273239327" lastFinishedPulling="2026-01-30 08:30:51.514315137 +0000 UTC m=+1290.209862266" observedRunningTime="2026-01-30 08:30:53.047115816 +0000 UTC m=+1291.742662935" watchObservedRunningTime="2026-01-30 08:30:53.049396308 +0000 UTC m=+1291.744943417" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.055480 4870 scope.go:117] "RemoveContainer" containerID="e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8" Jan 30 08:30:53 crc 
kubenswrapper[4870]: I0130 08:30:53.074393 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d675956bc-zzkss"] Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.083998 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d675956bc-zzkss"] Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.121100 4870 scope.go:117] "RemoveContainer" containerID="0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e" Jan 30 08:30:53 crc kubenswrapper[4870]: E0130 08:30:53.121712 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e\": container with ID starting with 0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e not found: ID does not exist" containerID="0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.121843 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e"} err="failed to get container status \"0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e\": rpc error: code = NotFound desc = could not find container \"0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e\": container with ID starting with 0e54e3d5282b98c208b0bfac76119dbcb9798dc658f490c1212de7282601c32e not found: ID does not exist" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.121962 4870 scope.go:117] "RemoveContainer" containerID="e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8" Jan 30 08:30:53 crc kubenswrapper[4870]: E0130 08:30:53.122368 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8\": container with 
ID starting with e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8 not found: ID does not exist" containerID="e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.122405 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8"} err="failed to get container status \"e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8\": rpc error: code = NotFound desc = could not find container \"e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8\": container with ID starting with e42b42a5c9ab862c026277e738c25344a1c208eaa58c5cd18419418c9ff99fc8 not found: ID does not exist" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.601047 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.610916 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709149 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-combined-ca-bundle\") pod \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709220 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2hkv\" (UniqueName: \"kubernetes.io/projected/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-kube-api-access-m2hkv\") pod \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709243 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-combined-ca-bundle\") pod \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709259 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-scripts\") pod \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709275 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4130fea-be36-47f2-9940-fd3bddcbe3c5-logs\") pod \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709344 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-config-data\") pod \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709487 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-config-data\") pod \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\" (UID: \"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.709552 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czf2b\" (UniqueName: \"kubernetes.io/projected/a4130fea-be36-47f2-9940-fd3bddcbe3c5-kube-api-access-czf2b\") pod \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\" (UID: \"a4130fea-be36-47f2-9940-fd3bddcbe3c5\") " Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.710367 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4130fea-be36-47f2-9940-fd3bddcbe3c5-logs" (OuterVolumeSpecName: "logs") pod "a4130fea-be36-47f2-9940-fd3bddcbe3c5" (UID: "a4130fea-be36-47f2-9940-fd3bddcbe3c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.710616 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4130fea-be36-47f2-9940-fd3bddcbe3c5-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.714467 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-kube-api-access-m2hkv" (OuterVolumeSpecName: "kube-api-access-m2hkv") pod "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" (UID: "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5"). InnerVolumeSpecName "kube-api-access-m2hkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.715985 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-scripts" (OuterVolumeSpecName: "scripts") pod "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" (UID: "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.722762 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4130fea-be36-47f2-9940-fd3bddcbe3c5-kube-api-access-czf2b" (OuterVolumeSpecName: "kube-api-access-czf2b") pod "a4130fea-be36-47f2-9940-fd3bddcbe3c5" (UID: "a4130fea-be36-47f2-9940-fd3bddcbe3c5"). InnerVolumeSpecName "kube-api-access-czf2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.739381 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4130fea-be36-47f2-9940-fd3bddcbe3c5" (UID: "a4130fea-be36-47f2-9940-fd3bddcbe3c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.744840 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" (UID: "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.747027 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-config-data" (OuterVolumeSpecName: "config-data") pod "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" (UID: "8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.749119 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-config-data" (OuterVolumeSpecName: "config-data") pod "a4130fea-be36-47f2-9940-fd3bddcbe3c5" (UID: "a4130fea-be36-47f2-9940-fd3bddcbe3c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.813192 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2hkv\" (UniqueName: \"kubernetes.io/projected/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-kube-api-access-m2hkv\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.813240 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.813263 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.813282 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:53 crc 
kubenswrapper[4870]: I0130 08:30:53.813305 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.813331 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czf2b\" (UniqueName: \"kubernetes.io/projected/a4130fea-be36-47f2-9940-fd3bddcbe3c5-kube-api-access-czf2b\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:53 crc kubenswrapper[4870]: I0130 08:30:53.813356 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4130fea-be36-47f2-9940-fd3bddcbe3c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.019657 4870 generic.go:334] "Generic (PLEG): container finished" podID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerID="690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87" exitCode=0 Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.019687 4870 generic.go:334] "Generic (PLEG): container finished" podID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerID="e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11" exitCode=143 Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.019719 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4130fea-be36-47f2-9940-fd3bddcbe3c5","Type":"ContainerDied","Data":"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87"} Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.019770 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4130fea-be36-47f2-9940-fd3bddcbe3c5","Type":"ContainerDied","Data":"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11"} Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.019782 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4130fea-be36-47f2-9940-fd3bddcbe3c5","Type":"ContainerDied","Data":"2bab53308716b22e97e1a942ac95ceca0d270b7f4b42f3d5be8a0178321b83a8"} Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.019799 4870 scope.go:117] "RemoveContainer" containerID="690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.019816 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.027654 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vrk8x" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.028698 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vrk8x" event={"ID":"8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5","Type":"ContainerDied","Data":"cabf47e5c9ce25b5156d8706dcfa4a8797c2b4154837e6413d0ebe4fba915dd9"} Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.028735 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cabf47e5c9ce25b5156d8706dcfa4a8797c2b4154837e6413d0ebe4fba915dd9" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.051943 4870 scope.go:117] "RemoveContainer" containerID="e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.097280 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3688605b-306e-4093-93d5-b96cae2a80de" path="/var/lib/kubelet/pods/3688605b-306e-4093-93d5-b96cae2a80de/volumes" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.146041 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.181680 4870 scope.go:117] "RemoveContainer" 
containerID="690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87" Jan 30 08:30:54 crc kubenswrapper[4870]: E0130 08:30:54.182320 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87\": container with ID starting with 690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87 not found: ID does not exist" containerID="690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.182357 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87"} err="failed to get container status \"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87\": rpc error: code = NotFound desc = could not find container \"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87\": container with ID starting with 690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87 not found: ID does not exist" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.182379 4870 scope.go:117] "RemoveContainer" containerID="e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11" Jan 30 08:30:54 crc kubenswrapper[4870]: E0130 08:30:54.182582 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11\": container with ID starting with e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11 not found: ID does not exist" containerID="e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.182605 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11"} err="failed to get container status \"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11\": rpc error: code = NotFound desc = could not find container \"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11\": container with ID starting with e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11 not found: ID does not exist" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.182619 4870 scope.go:117] "RemoveContainer" containerID="690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.182850 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87"} err="failed to get container status \"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87\": rpc error: code = NotFound desc = could not find container \"690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87\": container with ID starting with 690f1c21a74988a80744d4141420ba5d3d5d318a18b2e422108e6e5ee3f7be87 not found: ID does not exist" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.182974 4870 scope.go:117] "RemoveContainer" containerID="e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11" Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.183318 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11"} err="failed to get container status \"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11\": rpc error: code = NotFound desc = could not find container \"e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11\": container with ID starting with e040ac1b4d2768ad2859ffb22248f63acb61fa92fe83ab4891bb419e25c4da11 not found: ID does not 
exist"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.185386 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.185625 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e7c987d2-eb6f-4ad7-a6b3-97181526dc24" containerName="nova-scheduler-scheduler" containerID="cri-o://eeeda60427741389f29ce7682e68db91dead5b11853ceb5b520159a71514e942" gracePeriod=30
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.200639 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.214466 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.224775 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:30:54 crc kubenswrapper[4870]: E0130 08:30:54.225350 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3688605b-306e-4093-93d5-b96cae2a80de" containerName="dnsmasq-dns"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225381 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3688605b-306e-4093-93d5-b96cae2a80de" containerName="dnsmasq-dns"
Jan 30 08:30:54 crc kubenswrapper[4870]: E0130 08:30:54.225410 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-metadata"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225420 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-metadata"
Jan 30 08:30:54 crc kubenswrapper[4870]: E0130 08:30:54.225457 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3688605b-306e-4093-93d5-b96cae2a80de" containerName="init"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225466 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="3688605b-306e-4093-93d5-b96cae2a80de" containerName="init"
Jan 30 08:30:54 crc kubenswrapper[4870]: E0130 08:30:54.225486 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-log"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225494 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-log"
Jan 30 08:30:54 crc kubenswrapper[4870]: E0130 08:30:54.225517 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" containerName="nova-manage"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225525 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" containerName="nova-manage"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225770 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-log"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225802 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" containerName="nova-manage"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225820 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="3688605b-306e-4093-93d5-b96cae2a80de" containerName="dnsmasq-dns"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.225841 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" containerName="nova-metadata-metadata"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.227389 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.229613 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.229859 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.233178 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.330706 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdab968-579c-4189-87c5-05bad5469d6c-logs\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.330773 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.330821 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd5ql\" (UniqueName: \"kubernetes.io/projected/dcdab968-579c-4189-87c5-05bad5469d6c-kube-api-access-hd5ql\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.330862 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.330917 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-config-data\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.433087 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdab968-579c-4189-87c5-05bad5469d6c-logs\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.433280 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.433414 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd5ql\" (UniqueName: \"kubernetes.io/projected/dcdab968-579c-4189-87c5-05bad5469d6c-kube-api-access-hd5ql\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.433547 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.433591 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdab968-579c-4189-87c5-05bad5469d6c-logs\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.433732 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-config-data\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.437236 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.437856 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.439460 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-config-data\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.457235 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd5ql\" (UniqueName: \"kubernetes.io/projected/dcdab968-579c-4189-87c5-05bad5469d6c-kube-api-access-hd5ql\") pod \"nova-metadata-0\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") " pod="openstack/nova-metadata-0"
Jan 30 08:30:54 crc kubenswrapper[4870]: I0130 08:30:54.549527 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.039020 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-log" containerID="cri-o://f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a" gracePeriod=30
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.039191 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-api" containerID="cri-o://3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e" gracePeriod=30
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.063302 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.602408 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.653431 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35497556-3464-49d4-9dc2-8f8153a1db82-logs\") pod \"35497556-3464-49d4-9dc2-8f8153a1db82\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") "
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.653513 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-combined-ca-bundle\") pod \"35497556-3464-49d4-9dc2-8f8153a1db82\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") "
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.653590 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qwkz\" (UniqueName: \"kubernetes.io/projected/35497556-3464-49d4-9dc2-8f8153a1db82-kube-api-access-9qwkz\") pod \"35497556-3464-49d4-9dc2-8f8153a1db82\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") "
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.653622 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-config-data\") pod \"35497556-3464-49d4-9dc2-8f8153a1db82\" (UID: \"35497556-3464-49d4-9dc2-8f8153a1db82\") "
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.653893 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35497556-3464-49d4-9dc2-8f8153a1db82-logs" (OuterVolumeSpecName: "logs") pod "35497556-3464-49d4-9dc2-8f8153a1db82" (UID: "35497556-3464-49d4-9dc2-8f8153a1db82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.654152 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35497556-3464-49d4-9dc2-8f8153a1db82-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.657246 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35497556-3464-49d4-9dc2-8f8153a1db82-kube-api-access-9qwkz" (OuterVolumeSpecName: "kube-api-access-9qwkz") pod "35497556-3464-49d4-9dc2-8f8153a1db82" (UID: "35497556-3464-49d4-9dc2-8f8153a1db82"). InnerVolumeSpecName "kube-api-access-9qwkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.678795 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-config-data" (OuterVolumeSpecName: "config-data") pod "35497556-3464-49d4-9dc2-8f8153a1db82" (UID: "35497556-3464-49d4-9dc2-8f8153a1db82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.679971 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35497556-3464-49d4-9dc2-8f8153a1db82" (UID: "35497556-3464-49d4-9dc2-8f8153a1db82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.756488 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.756725 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qwkz\" (UniqueName: \"kubernetes.io/projected/35497556-3464-49d4-9dc2-8f8153a1db82-kube-api-access-9qwkz\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:55 crc kubenswrapper[4870]: I0130 08:30:55.756797 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35497556-3464-49d4-9dc2-8f8153a1db82-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.054229 4870 generic.go:334] "Generic (PLEG): container finished" podID="35497556-3464-49d4-9dc2-8f8153a1db82" containerID="3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e" exitCode=0
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.054641 4870 generic.go:334] "Generic (PLEG): container finished" podID="35497556-3464-49d4-9dc2-8f8153a1db82" containerID="f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a" exitCode=143
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.054296 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.054318 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35497556-3464-49d4-9dc2-8f8153a1db82","Type":"ContainerDied","Data":"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"}
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.054831 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35497556-3464-49d4-9dc2-8f8153a1db82","Type":"ContainerDied","Data":"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"}
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.054923 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35497556-3464-49d4-9dc2-8f8153a1db82","Type":"ContainerDied","Data":"4241ed3c0f6370f180f8f986d85580b542b235a9c663a81abf39c2245d59012a"}
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.054956 4870 scope.go:117] "RemoveContainer" containerID="3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.064978 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdab968-579c-4189-87c5-05bad5469d6c","Type":"ContainerStarted","Data":"366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82"}
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.065082 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdab968-579c-4189-87c5-05bad5469d6c","Type":"ContainerStarted","Data":"d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161"}
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.065110 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdab968-579c-4189-87c5-05bad5469d6c","Type":"ContainerStarted","Data":"01a2218e5e8fc92f78dfaa53cf3e950822f8a4a9869c349d33052df56fc52370"}
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.102098 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.102033179 podStartE2EDuration="2.102033179s" podCreationTimestamp="2026-01-30 08:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:30:56.094708699 +0000 UTC m=+1294.790255848" watchObservedRunningTime="2026-01-30 08:30:56.102033179 +0000 UTC m=+1294.797580348"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.107535 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4130fea-be36-47f2-9940-fd3bddcbe3c5" path="/var/lib/kubelet/pods/a4130fea-be36-47f2-9940-fd3bddcbe3c5/volumes"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.162310 4870 scope.go:117] "RemoveContainer" containerID="f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.199169 4870 scope.go:117] "RemoveContainer" containerID="3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"
Jan 30 08:30:56 crc kubenswrapper[4870]: E0130 08:30:56.200108 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e\": container with ID starting with 3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e not found: ID does not exist" containerID="3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.200194 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"} err="failed to get container status \"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e\": rpc error: code = NotFound desc = could not find container \"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e\": container with ID starting with 3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e not found: ID does not exist"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.200244 4870 scope.go:117] "RemoveContainer" containerID="f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"
Jan 30 08:30:56 crc kubenswrapper[4870]: E0130 08:30:56.201305 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a\": container with ID starting with f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a not found: ID does not exist" containerID="f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.201357 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"} err="failed to get container status \"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a\": rpc error: code = NotFound desc = could not find container \"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a\": container with ID starting with f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a not found: ID does not exist"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.201393 4870 scope.go:117] "RemoveContainer" containerID="3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.201828 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e"} err="failed to get container status \"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e\": rpc error: code = NotFound desc = could not find container \"3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e\": container with ID starting with 3fe2bc5983328951bf21fed89d98d84320282998186c3a7ff39b45c9f020cb9e not found: ID does not exist"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.201862 4870 scope.go:117] "RemoveContainer" containerID="f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.202277 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a"} err="failed to get container status \"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a\": rpc error: code = NotFound desc = could not find container \"f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a\": container with ID starting with f6e6311173c91fb7efc38adadec81d62d3654a13f033876ef586eb719910253a not found: ID does not exist"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.367368 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 30 08:30:56 crc kubenswrapper[4870]: I0130 08:30:56.576740 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 08:30:59 crc kubenswrapper[4870]: I0130 08:30:59.550478 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 08:30:59 crc kubenswrapper[4870]: I0130 08:30:59.551116 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 08:31:00 crc kubenswrapper[4870]: I0130 08:31:00.123925 4870 generic.go:334] "Generic (PLEG): container finished" podID="df7d1e35-e72c-4a05-8a4a-89647f93a26c" containerID="8042d48e5c92127670e964e37628c795eb4a833864e07f5be3a23644c40ab2aa" exitCode=0
Jan 30 08:31:00 crc kubenswrapper[4870]: I0130 08:31:00.123987 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m882v" event={"ID":"df7d1e35-e72c-4a05-8a4a-89647f93a26c","Type":"ContainerDied","Data":"8042d48e5c92127670e964e37628c795eb4a833864e07f5be3a23644c40ab2aa"}
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.081392 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.535581 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m882v"
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.585854 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-scripts\") pod \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") "
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.585985 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2s2k\" (UniqueName: \"kubernetes.io/projected/df7d1e35-e72c-4a05-8a4a-89647f93a26c-kube-api-access-d2s2k\") pod \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") "
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.586190 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-config-data\") pod \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") "
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.586263 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-combined-ca-bundle\") pod \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\" (UID: \"df7d1e35-e72c-4a05-8a4a-89647f93a26c\") "
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.594415 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-scripts" (OuterVolumeSpecName: "scripts") pod "df7d1e35-e72c-4a05-8a4a-89647f93a26c" (UID: "df7d1e35-e72c-4a05-8a4a-89647f93a26c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.594481 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7d1e35-e72c-4a05-8a4a-89647f93a26c-kube-api-access-d2s2k" (OuterVolumeSpecName: "kube-api-access-d2s2k") pod "df7d1e35-e72c-4a05-8a4a-89647f93a26c" (UID: "df7d1e35-e72c-4a05-8a4a-89647f93a26c"). InnerVolumeSpecName "kube-api-access-d2s2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.626476 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-config-data" (OuterVolumeSpecName: "config-data") pod "df7d1e35-e72c-4a05-8a4a-89647f93a26c" (UID: "df7d1e35-e72c-4a05-8a4a-89647f93a26c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.639485 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df7d1e35-e72c-4a05-8a4a-89647f93a26c" (UID: "df7d1e35-e72c-4a05-8a4a-89647f93a26c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.688221 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.688254 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.688268 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7d1e35-e72c-4a05-8a4a-89647f93a26c-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:01 crc kubenswrapper[4870]: I0130 08:31:01.688280 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2s2k\" (UniqueName: \"kubernetes.io/projected/df7d1e35-e72c-4a05-8a4a-89647f93a26c-kube-api-access-d2s2k\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.149545 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m882v" event={"ID":"df7d1e35-e72c-4a05-8a4a-89647f93a26c","Type":"ContainerDied","Data":"1180a868984313de66495d96e16fbf7b9e7f4d89169ce27b87d1df78e6c1cbcf"}
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.149584 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1180a868984313de66495d96e16fbf7b9e7f4d89169ce27b87d1df78e6c1cbcf"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.149647 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m882v"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.236845 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 30 08:31:02 crc kubenswrapper[4870]: E0130 08:31:02.237321 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-api"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.237346 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-api"
Jan 30 08:31:02 crc kubenswrapper[4870]: E0130 08:31:02.237362 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-log"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.237370 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-log"
Jan 30 08:31:02 crc kubenswrapper[4870]: E0130 08:31:02.237405 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7d1e35-e72c-4a05-8a4a-89647f93a26c" containerName="nova-cell1-conductor-db-sync"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.237415 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7d1e35-e72c-4a05-8a4a-89647f93a26c" containerName="nova-cell1-conductor-db-sync"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.237639 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-log"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.237670 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7d1e35-e72c-4a05-8a4a-89647f93a26c" containerName="nova-cell1-conductor-db-sync"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.237693 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" containerName="nova-api-api"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.238551 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.241099 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.257008 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.298897 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5686258-ed50-49a1-920b-77e9bbe01c55-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.299035 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5686258-ed50-49a1-920b-77e9bbe01c55-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.299078 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shltt\" (UniqueName: \"kubernetes.io/projected/e5686258-ed50-49a1-920b-77e9bbe01c55-kube-api-access-shltt\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.400311 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5686258-ed50-49a1-920b-77e9bbe01c55-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.400589 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shltt\" (UniqueName: \"kubernetes.io/projected/e5686258-ed50-49a1-920b-77e9bbe01c55-kube-api-access-shltt\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.400764 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5686258-ed50-49a1-920b-77e9bbe01c55-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.407463 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5686258-ed50-49a1-920b-77e9bbe01c55-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.408463 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5686258-ed50-49a1-920b-77e9bbe01c55-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.422746 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shltt\" (UniqueName: \"kubernetes.io/projected/e5686258-ed50-49a1-920b-77e9bbe01c55-kube-api-access-shltt\") pod \"nova-cell1-conductor-0\" (UID: \"e5686258-ed50-49a1-920b-77e9bbe01c55\") " pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:02 crc kubenswrapper[4870]: I0130 08:31:02.556965 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:03 crc kubenswrapper[4870]: I0130 08:31:03.074160 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 30 08:31:03 crc kubenswrapper[4870]: I0130 08:31:03.161075 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5686258-ed50-49a1-920b-77e9bbe01c55","Type":"ContainerStarted","Data":"9bc3cf99ea43486766ec0582617eb0c3869fb9fd8ea7d7d2478a53cfa64c25fa"}
Jan 30 08:31:04 crc kubenswrapper[4870]: I0130 08:31:04.174413 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5686258-ed50-49a1-920b-77e9bbe01c55","Type":"ContainerStarted","Data":"702865b8243cf2dc5d72020ba917c8fd76ea4d8cb4669689a4569fd86a0eaeb1"}
Jan 30 08:31:04 crc kubenswrapper[4870]: I0130 08:31:04.175935 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Jan 30 08:31:04 crc kubenswrapper[4870]: I0130 08:31:04.195246 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.195226038 podStartE2EDuration="2.195226038s" podCreationTimestamp="2026-01-30 08:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:04.187592868 +0000 UTC m=+1302.883139987" watchObservedRunningTime="2026-01-30 08:31:04.195226038 +0000 UTC m=+1302.890773167"
Jan 30 08:31:04 crc kubenswrapper[4870]: I0130 08:31:04.559244 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 08:31:04 crc kubenswrapper[4870]: I0130 08:31:04.559315 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 08:31:04 crc kubenswrapper[4870]: I0130 08:31:04.871333 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 08:31:04 crc kubenswrapper[4870]: I0130 08:31:04.871560 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="dd3b1e9c-90bb-46b7-8e19-edc1388b2a67" containerName="kube-state-metrics" containerID="cri-o://9f3acd6bbada01f386eced240ac88a963b5d6446c306ab4b66863d5ccf3e1172" gracePeriod=30
Jan 30 08:31:05 crc kubenswrapper[4870]: E0130 08:31:05.092798 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3b1e9c_90bb_46b7_8e19_edc1388b2a67.slice/crio-conmon-9f3acd6bbada01f386eced240ac88a963b5d6446c306ab4b66863d5ccf3e1172.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3b1e9c_90bb_46b7_8e19_edc1388b2a67.slice/crio-9f3acd6bbada01f386eced240ac88a963b5d6446c306ab4b66863d5ccf3e1172.scope\": RecentStats: unable to find data in memory cache]"
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.207043 4870 generic.go:334] "Generic (PLEG): container finished" podID="dd3b1e9c-90bb-46b7-8e19-edc1388b2a67" containerID="9f3acd6bbada01f386eced240ac88a963b5d6446c306ab4b66863d5ccf3e1172" exitCode=2
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.208089 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67","Type":"ContainerDied","Data":"9f3acd6bbada01f386eced240ac88a963b5d6446c306ab4b66863d5ccf3e1172"}
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.436644 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.464981 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckp8w\" (UniqueName: \"kubernetes.io/projected/dd3b1e9c-90bb-46b7-8e19-edc1388b2a67-kube-api-access-ckp8w\") pod \"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67\" (UID: \"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67\") "
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.478103 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3b1e9c-90bb-46b7-8e19-edc1388b2a67-kube-api-access-ckp8w" (OuterVolumeSpecName: "kube-api-access-ckp8w") pod "dd3b1e9c-90bb-46b7-8e19-edc1388b2a67" (UID: "dd3b1e9c-90bb-46b7-8e19-edc1388b2a67"). InnerVolumeSpecName "kube-api-access-ckp8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.568943 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckp8w\" (UniqueName: \"kubernetes.io/projected/dd3b1e9c-90bb-46b7-8e19-edc1388b2a67-kube-api-access-ckp8w\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.578154 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 08:31:05 crc kubenswrapper[4870]: I0130 08:31:05.578204 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.221441 4870
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.222276 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dd3b1e9c-90bb-46b7-8e19-edc1388b2a67","Type":"ContainerDied","Data":"e0a5d30d3c77180ecb50a167671b3d1f7955018f6057ec5124abb113c1fa8b6f"} Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.222320 4870 scope.go:117] "RemoveContainer" containerID="9f3acd6bbada01f386eced240ac88a963b5d6446c306ab4b66863d5ccf3e1172" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.254799 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.274819 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.284359 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:31:06 crc kubenswrapper[4870]: E0130 08:31:06.285002 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3b1e9c-90bb-46b7-8e19-edc1388b2a67" containerName="kube-state-metrics" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.285024 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3b1e9c-90bb-46b7-8e19-edc1388b2a67" containerName="kube-state-metrics" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.285282 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3b1e9c-90bb-46b7-8e19-edc1388b2a67" containerName="kube-state-metrics" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.286162 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.288077 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.288281 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.293220 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.382354 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfth5\" (UniqueName: \"kubernetes.io/projected/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-api-access-vfth5\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.382639 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.382692 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.382743 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.485995 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.486301 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfth5\" (UniqueName: \"kubernetes.io/projected/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-api-access-vfth5\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.486397 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.486533 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.490482 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.494284 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.506652 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.508398 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfth5\" (UniqueName: \"kubernetes.io/projected/0deb54ca-48c2-4b35-88c0-dbad5e8b9272-kube-api-access-vfth5\") pod \"kube-state-metrics-0\" (UID: \"0deb54ca-48c2-4b35-88c0-dbad5e8b9272\") " pod="openstack/kube-state-metrics-0" Jan 30 08:31:06 crc kubenswrapper[4870]: I0130 08:31:06.650247 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 08:31:07 crc kubenswrapper[4870]: I0130 08:31:07.083963 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 08:31:07 crc kubenswrapper[4870]: I0130 08:31:07.238079 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0deb54ca-48c2-4b35-88c0-dbad5e8b9272","Type":"ContainerStarted","Data":"7db1ca73141d6540be76da681f4d46da24d920385590ee7f38fa0088a26be648"} Jan 30 08:31:07 crc kubenswrapper[4870]: I0130 08:31:07.260454 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:31:07 crc kubenswrapper[4870]: I0130 08:31:07.260725 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-central-agent" containerID="cri-o://0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb" gracePeriod=30 Jan 30 08:31:07 crc kubenswrapper[4870]: I0130 08:31:07.260863 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="proxy-httpd" containerID="cri-o://722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394" gracePeriod=30 Jan 30 08:31:07 crc kubenswrapper[4870]: I0130 08:31:07.260924 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="sg-core" containerID="cri-o://e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3" gracePeriod=30 Jan 30 08:31:07 crc kubenswrapper[4870]: I0130 08:31:07.260956 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-notification-agent" 
containerID="cri-o://b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd" gracePeriod=30 Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.096106 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3b1e9c-90bb-46b7-8e19-edc1388b2a67" path="/var/lib/kubelet/pods/dd3b1e9c-90bb-46b7-8e19-edc1388b2a67/volumes" Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.253810 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0deb54ca-48c2-4b35-88c0-dbad5e8b9272","Type":"ContainerStarted","Data":"d97bf8f7ba290c5c6467f1cebe12d605a85b69530b2a8da28bee0985109cd9a4"} Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.254990 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.257084 4870 generic.go:334] "Generic (PLEG): container finished" podID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerID="722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394" exitCode=0 Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.257127 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerDied","Data":"722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394"} Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.257140 4870 generic.go:334] "Generic (PLEG): container finished" podID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerID="e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3" exitCode=2 Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.257156 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerDied","Data":"e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3"} Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.257161 4870 
generic.go:334] "Generic (PLEG): container finished" podID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerID="0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb" exitCode=0 Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.257167 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerDied","Data":"0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb"} Jan 30 08:31:08 crc kubenswrapper[4870]: I0130 08:31:08.285541 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.904433029 podStartE2EDuration="2.285511231s" podCreationTimestamp="2026-01-30 08:31:06 +0000 UTC" firstStartedPulling="2026-01-30 08:31:07.097482264 +0000 UTC m=+1305.793029373" lastFinishedPulling="2026-01-30 08:31:07.478560466 +0000 UTC m=+1306.174107575" observedRunningTime="2026-01-30 08:31:08.273436971 +0000 UTC m=+1306.968984140" watchObservedRunningTime="2026-01-30 08:31:08.285511231 +0000 UTC m=+1306.981058380" Jan 30 08:31:12 crc kubenswrapper[4870]: I0130 08:31:12.595203 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.838325 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.950491 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2wxb\" (UniqueName: \"kubernetes.io/projected/f11d5abc-9e24-41c5-9e26-22a939d70180-kube-api-access-l2wxb\") pod \"f11d5abc-9e24-41c5-9e26-22a939d70180\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.950773 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-config-data\") pod \"f11d5abc-9e24-41c5-9e26-22a939d70180\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.950899 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-combined-ca-bundle\") pod \"f11d5abc-9e24-41c5-9e26-22a939d70180\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.950997 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-sg-core-conf-yaml\") pod \"f11d5abc-9e24-41c5-9e26-22a939d70180\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.951100 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-scripts\") pod \"f11d5abc-9e24-41c5-9e26-22a939d70180\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.951300 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-run-httpd\") pod \"f11d5abc-9e24-41c5-9e26-22a939d70180\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.951397 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-log-httpd\") pod \"f11d5abc-9e24-41c5-9e26-22a939d70180\" (UID: \"f11d5abc-9e24-41c5-9e26-22a939d70180\") " Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.952146 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f11d5abc-9e24-41c5-9e26-22a939d70180" (UID: "f11d5abc-9e24-41c5-9e26-22a939d70180"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.952775 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f11d5abc-9e24-41c5-9e26-22a939d70180" (UID: "f11d5abc-9e24-41c5-9e26-22a939d70180"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.956485 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11d5abc-9e24-41c5-9e26-22a939d70180-kube-api-access-l2wxb" (OuterVolumeSpecName: "kube-api-access-l2wxb") pod "f11d5abc-9e24-41c5-9e26-22a939d70180" (UID: "f11d5abc-9e24-41c5-9e26-22a939d70180"). InnerVolumeSpecName "kube-api-access-l2wxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.966147 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-scripts" (OuterVolumeSpecName: "scripts") pod "f11d5abc-9e24-41c5-9e26-22a939d70180" (UID: "f11d5abc-9e24-41c5-9e26-22a939d70180"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:13 crc kubenswrapper[4870]: I0130 08:31:13.981158 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f11d5abc-9e24-41c5-9e26-22a939d70180" (UID: "f11d5abc-9e24-41c5-9e26-22a939d70180"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.029181 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f11d5abc-9e24-41c5-9e26-22a939d70180" (UID: "f11d5abc-9e24-41c5-9e26-22a939d70180"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.053781 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-config-data" (OuterVolumeSpecName: "config-data") pod "f11d5abc-9e24-41c5-9e26-22a939d70180" (UID: "f11d5abc-9e24-41c5-9e26-22a939d70180"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.054200 4870 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.054248 4870 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f11d5abc-9e24-41c5-9e26-22a939d70180-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.054259 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2wxb\" (UniqueName: \"kubernetes.io/projected/f11d5abc-9e24-41c5-9e26-22a939d70180-kube-api-access-l2wxb\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.054271 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.054283 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.054291 4870 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.054322 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11d5abc-9e24-41c5-9e26-22a939d70180-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.327654 4870 generic.go:334] "Generic 
(PLEG): container finished" podID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerID="b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd" exitCode=0 Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.327768 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerDied","Data":"b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd"} Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.327809 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.328211 4870 scope.go:117] "RemoveContainer" containerID="722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.328116 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f11d5abc-9e24-41c5-9e26-22a939d70180","Type":"ContainerDied","Data":"0c2a4eddab15cef0cbeaace28cc33784a69c48cca42cc8364520ed9bf9b84959"} Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.359103 4870 scope.go:117] "RemoveContainer" containerID="e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.369268 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.401246 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.413133 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.413699 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-central-agent" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 
08:31:14.413724 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-central-agent" Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.413746 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="sg-core" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.413755 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="sg-core" Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.413780 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="proxy-httpd" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.413792 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="proxy-httpd" Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.413808 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-notification-agent" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.413816 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-notification-agent" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.416405 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-notification-agent" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.416426 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="proxy-httpd" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.416446 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="sg-core" Jan 30 08:31:14 crc 
kubenswrapper[4870]: I0130 08:31:14.416459 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" containerName="ceilometer-central-agent" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.418266 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.420872 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.421294 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.422360 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.462048 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.468347 4870 scope.go:117] "RemoveContainer" containerID="b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.490039 4870 scope.go:117] "RemoveContainer" containerID="0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.517604 4870 scope.go:117] "RemoveContainer" containerID="722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394" Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.518410 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394\": container with ID starting with 722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394 not found: ID does not exist" 
containerID="722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.518444 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394"} err="failed to get container status \"722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394\": rpc error: code = NotFound desc = could not find container \"722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394\": container with ID starting with 722f92a03dfb1268adbc4665dd21c44e9be63f5363a6d3d067feb62e2ae20394 not found: ID does not exist" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.518466 4870 scope.go:117] "RemoveContainer" containerID="e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3" Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.518971 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3\": container with ID starting with e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3 not found: ID does not exist" containerID="e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.519000 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3"} err="failed to get container status \"e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3\": rpc error: code = NotFound desc = could not find container \"e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3\": container with ID starting with e877fde2e65db2c46b4cbcbdc12f0a69875cd4efef2febf80a45b73912cd72f3 not found: ID does not exist" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.519015 4870 scope.go:117] 
"RemoveContainer" containerID="b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd" Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.519349 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd\": container with ID starting with b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd not found: ID does not exist" containerID="b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.519371 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd"} err="failed to get container status \"b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd\": rpc error: code = NotFound desc = could not find container \"b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd\": container with ID starting with b3b257aee6689f65a0ce66059c2ab16b201b35c4811c0b35bb99d22792234acd not found: ID does not exist" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.519384 4870 scope.go:117] "RemoveContainer" containerID="0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb" Jan 30 08:31:14 crc kubenswrapper[4870]: E0130 08:31:14.519771 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb\": container with ID starting with 0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb not found: ID does not exist" containerID="0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.519808 4870 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb"} err="failed to get container status \"0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb\": rpc error: code = NotFound desc = could not find container \"0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb\": container with ID starting with 0440343c5e151c4584f5e06526e461e0ed6bbe86b28f995cc6f02e8fb5a22cdb not found: ID does not exist" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.566527 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnc7j\" (UniqueName: \"kubernetes.io/projected/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-kube-api-access-hnc7j\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.566698 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-run-httpd\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.566792 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-scripts\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.566836 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-config-data\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 
08:31:14.566935 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-log-httpd\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.566984 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.567124 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.567323 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.568405 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.573682 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.573821 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 08:31:14 crc 
kubenswrapper[4870]: I0130 08:31:14.669136 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-run-httpd\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669203 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-scripts\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669229 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-config-data\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669330 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-log-httpd\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669353 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669378 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669417 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669473 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnc7j\" (UniqueName: \"kubernetes.io/projected/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-kube-api-access-hnc7j\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.669735 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-run-httpd\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.670056 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-log-httpd\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.675303 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.675711 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-scripts\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.676497 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.685467 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.686070 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-config-data\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.690334 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnc7j\" (UniqueName: \"kubernetes.io/projected/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-kube-api-access-hnc7j\") pod \"ceilometer-0\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " pod="openstack/ceilometer-0" Jan 30 08:31:14 crc kubenswrapper[4870]: I0130 08:31:14.759192 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:31:15 crc kubenswrapper[4870]: I0130 08:31:15.273455 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:31:15 crc kubenswrapper[4870]: W0130 08:31:15.273625 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5bff0c_1f97_4c2d_9f95_46c7c3799d27.slice/crio-878d9967ec40148177071222b9ab0dc547347647130442c6f0bf2d56dd31e4d4 WatchSource:0}: Error finding container 878d9967ec40148177071222b9ab0dc547347647130442c6f0bf2d56dd31e4d4: Status 404 returned error can't find the container with id 878d9967ec40148177071222b9ab0dc547347647130442c6f0bf2d56dd31e4d4 Jan 30 08:31:15 crc kubenswrapper[4870]: I0130 08:31:15.339101 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerStarted","Data":"878d9967ec40148177071222b9ab0dc547347647130442c6f0bf2d56dd31e4d4"} Jan 30 08:31:15 crc kubenswrapper[4870]: I0130 08:31:15.347184 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 08:31:16 crc kubenswrapper[4870]: I0130 08:31:16.088083 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11d5abc-9e24-41c5-9e26-22a939d70180" path="/var/lib/kubelet/pods/f11d5abc-9e24-41c5-9e26-22a939d70180/volumes" Jan 30 08:31:16 crc kubenswrapper[4870]: I0130 08:31:16.660774 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 08:31:17 crc kubenswrapper[4870]: I0130 08:31:17.367503 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerStarted","Data":"919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2"} Jan 30 08:31:17 crc kubenswrapper[4870]: I0130 08:31:17.367890 
4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerStarted","Data":"718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531"} Jan 30 08:31:18 crc kubenswrapper[4870]: I0130 08:31:18.379194 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerStarted","Data":"75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5"} Jan 30 08:31:20 crc kubenswrapper[4870]: I0130 08:31:20.404671 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerStarted","Data":"74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a"} Jan 30 08:31:20 crc kubenswrapper[4870]: I0130 08:31:20.406420 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 08:31:20 crc kubenswrapper[4870]: I0130 08:31:20.443230 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.336718606 podStartE2EDuration="6.44321049s" podCreationTimestamp="2026-01-30 08:31:14 +0000 UTC" firstStartedPulling="2026-01-30 08:31:15.276653965 +0000 UTC m=+1313.972201064" lastFinishedPulling="2026-01-30 08:31:19.383145849 +0000 UTC m=+1318.078692948" observedRunningTime="2026-01-30 08:31:20.429981712 +0000 UTC m=+1319.125528831" watchObservedRunningTime="2026-01-30 08:31:20.44321049 +0000 UTC m=+1319.138757589" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.426864 4870 generic.go:334] "Generic (PLEG): container finished" podID="21f6c18c-fcc7-4bd5-9a86-81dacd111e90" containerID="cf022953959b6108a335c22e59a92909beccc351b6ace66848278caae812affb" exitCode=137 Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.426934 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21f6c18c-fcc7-4bd5-9a86-81dacd111e90","Type":"ContainerDied","Data":"cf022953959b6108a335c22e59a92909beccc351b6ace66848278caae812affb"} Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.427333 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21f6c18c-fcc7-4bd5-9a86-81dacd111e90","Type":"ContainerDied","Data":"ad4cc5b31839637e7fda99e5e00f775d4b4a8266d3fec39e6cf700bdb0b11a6c"} Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.427372 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4cc5b31839637e7fda99e5e00f775d4b4a8266d3fec39e6cf700bdb0b11a6c" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.530674 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.643699 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-config-data\") pod \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.644168 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdt9v\" (UniqueName: \"kubernetes.io/projected/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-kube-api-access-sdt9v\") pod \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.644231 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-combined-ca-bundle\") pod \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\" (UID: \"21f6c18c-fcc7-4bd5-9a86-81dacd111e90\") " Jan 30 08:31:22 crc 
kubenswrapper[4870]: I0130 08:31:22.650025 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-kube-api-access-sdt9v" (OuterVolumeSpecName: "kube-api-access-sdt9v") pod "21f6c18c-fcc7-4bd5-9a86-81dacd111e90" (UID: "21f6c18c-fcc7-4bd5-9a86-81dacd111e90"). InnerVolumeSpecName "kube-api-access-sdt9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.674288 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21f6c18c-fcc7-4bd5-9a86-81dacd111e90" (UID: "21f6c18c-fcc7-4bd5-9a86-81dacd111e90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.684527 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-config-data" (OuterVolumeSpecName: "config-data") pod "21f6c18c-fcc7-4bd5-9a86-81dacd111e90" (UID: "21f6c18c-fcc7-4bd5-9a86-81dacd111e90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.746518 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdt9v\" (UniqueName: \"kubernetes.io/projected/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-kube-api-access-sdt9v\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.746550 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:22 crc kubenswrapper[4870]: I0130 08:31:22.746563 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f6c18c-fcc7-4bd5-9a86-81dacd111e90-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.437252 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.471507 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.481038 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.504414 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:31:23 crc kubenswrapper[4870]: E0130 08:31:23.504997 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f6c18c-fcc7-4bd5-9a86-81dacd111e90" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.505022 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f6c18c-fcc7-4bd5-9a86-81dacd111e90" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 08:31:23 crc kubenswrapper[4870]: 
I0130 08:31:23.505295 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f6c18c-fcc7-4bd5-9a86-81dacd111e90" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.506259 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.510108 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.513261 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.515521 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.517136 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.667949 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.668406 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfglr\" (UniqueName: \"kubernetes.io/projected/f6319a2a-594b-4da1-be42-ad0918221515-kube-api-access-nfglr\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.668604 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.668641 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.668723 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.770921 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.771223 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.771383 4870 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.771605 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.771737 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfglr\" (UniqueName: \"kubernetes.io/projected/f6319a2a-594b-4da1-be42-ad0918221515-kube-api-access-nfglr\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.776184 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.776631 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.777435 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.777808 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6319a2a-594b-4da1-be42-ad0918221515-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.788843 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfglr\" (UniqueName: \"kubernetes.io/projected/f6319a2a-594b-4da1-be42-ad0918221515-kube-api-access-nfglr\") pod \"nova-cell1-novncproxy-0\" (UID: \"f6319a2a-594b-4da1-be42-ad0918221515\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:23 crc kubenswrapper[4870]: I0130 08:31:23.835387 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.095831 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f6c18c-fcc7-4bd5-9a86-81dacd111e90" path="/var/lib/kubelet/pods/21f6c18c-fcc7-4bd5-9a86-81dacd111e90/volumes" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.327115 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 08:31:24 crc kubenswrapper[4870]: W0130 08:31:24.379495 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6319a2a_594b_4da1_be42_ad0918221515.slice/crio-3b18cc7676d8de144f48364305f73360d994ac368fd4a0ebf944668d0f78010d WatchSource:0}: Error finding container 3b18cc7676d8de144f48364305f73360d994ac368fd4a0ebf944668d0f78010d: Status 404 returned error can't find the container with id 3b18cc7676d8de144f48364305f73360d994ac368fd4a0ebf944668d0f78010d Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.460698 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f6319a2a-594b-4da1-be42-ad0918221515","Type":"ContainerStarted","Data":"3b18cc7676d8de144f48364305f73360d994ac368fd4a0ebf944668d0f78010d"} Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.462917 4870 generic.go:334] "Generic (PLEG): container finished" podID="e7c987d2-eb6f-4ad7-a6b3-97181526dc24" containerID="eeeda60427741389f29ce7682e68db91dead5b11853ceb5b520159a71514e942" exitCode=137 Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.462915 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c987d2-eb6f-4ad7-a6b3-97181526dc24","Type":"ContainerDied","Data":"eeeda60427741389f29ce7682e68db91dead5b11853ceb5b520159a71514e942"} Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.727318 4870 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.893332 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-config-data\") pod \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.893422 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85spt\" (UniqueName: \"kubernetes.io/projected/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-kube-api-access-85spt\") pod \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.893528 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-combined-ca-bundle\") pod \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\" (UID: \"e7c987d2-eb6f-4ad7-a6b3-97181526dc24\") " Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.898045 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-kube-api-access-85spt" (OuterVolumeSpecName: "kube-api-access-85spt") pod "e7c987d2-eb6f-4ad7-a6b3-97181526dc24" (UID: "e7c987d2-eb6f-4ad7-a6b3-97181526dc24"). InnerVolumeSpecName "kube-api-access-85spt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.919606 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-config-data" (OuterVolumeSpecName: "config-data") pod "e7c987d2-eb6f-4ad7-a6b3-97181526dc24" (UID: "e7c987d2-eb6f-4ad7-a6b3-97181526dc24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.920097 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7c987d2-eb6f-4ad7-a6b3-97181526dc24" (UID: "e7c987d2-eb6f-4ad7-a6b3-97181526dc24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.996075 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85spt\" (UniqueName: \"kubernetes.io/projected/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-kube-api-access-85spt\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.996121 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:24 crc kubenswrapper[4870]: I0130 08:31:24.996134 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c987d2-eb6f-4ad7-a6b3-97181526dc24-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.250477 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.250917 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.478978 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f6319a2a-594b-4da1-be42-ad0918221515","Type":"ContainerStarted","Data":"985c9a0893bc80da949bc398f770276f79a8adc7c1e7d4dab27df98aa10edf8b"} Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.485226 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7c987d2-eb6f-4ad7-a6b3-97181526dc24","Type":"ContainerDied","Data":"4844fa8b96732628baf7088ac1c015a802eaab1095cb6ae769aae1a9f257db48"} Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.485280 4870 scope.go:117] "RemoveContainer" containerID="eeeda60427741389f29ce7682e68db91dead5b11853ceb5b520159a71514e942" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.485290 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.512960 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.51293849 podStartE2EDuration="2.51293849s" podCreationTimestamp="2026-01-30 08:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:25.511949799 +0000 UTC m=+1324.207496918" watchObservedRunningTime="2026-01-30 08:31:25.51293849 +0000 UTC m=+1324.208485609" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.551536 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.568987 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.583273 4870 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:31:25 crc kubenswrapper[4870]: E0130 08:31:25.583854 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c987d2-eb6f-4ad7-a6b3-97181526dc24" containerName="nova-scheduler-scheduler" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.583901 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c987d2-eb6f-4ad7-a6b3-97181526dc24" containerName="nova-scheduler-scheduler" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.584187 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c987d2-eb6f-4ad7-a6b3-97181526dc24" containerName="nova-scheduler-scheduler" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.585092 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.587902 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.596558 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.715407 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nzwt\" (UniqueName: \"kubernetes.io/projected/7769eb04-0ff3-41ef-9977-e66563ea4085-kube-api-access-4nzwt\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.715755 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-config-data\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc 
kubenswrapper[4870]: I0130 08:31:25.715876 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: E0130 08:31:25.763189 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c987d2_eb6f_4ad7_a6b3_97181526dc24.slice/crio-4844fa8b96732628baf7088ac1c015a802eaab1095cb6ae769aae1a9f257db48\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c987d2_eb6f_4ad7_a6b3_97181526dc24.slice\": RecentStats: unable to find data in memory cache]" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.817575 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nzwt\" (UniqueName: \"kubernetes.io/projected/7769eb04-0ff3-41ef-9977-e66563ea4085-kube-api-access-4nzwt\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.817661 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-config-data\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.817685 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.825357 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-config-data\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.837868 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.848254 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nzwt\" (UniqueName: \"kubernetes.io/projected/7769eb04-0ff3-41ef-9977-e66563ea4085-kube-api-access-4nzwt\") pod \"nova-scheduler-0\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " pod="openstack/nova-scheduler-0" Jan 30 08:31:25 crc kubenswrapper[4870]: I0130 08:31:25.913380 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.102653 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c987d2-eb6f-4ad7-a6b3-97181526dc24" path="/var/lib/kubelet/pods/e7c987d2-eb6f-4ad7-a6b3-97181526dc24/volumes" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.143458 4870 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod35497556-3464-49d4-9dc2-8f8153a1db82"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod35497556-3464-49d4-9dc2-8f8153a1db82] : Timed out while waiting for systemd to remove kubepods-besteffort-pod35497556_3464_49d4_9dc2_8f8153a1db82.slice" Jan 30 08:31:26 crc kubenswrapper[4870]: E0130 08:31:26.143519 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod35497556-3464-49d4-9dc2-8f8153a1db82] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod35497556-3464-49d4-9dc2-8f8153a1db82] : Timed out while waiting for systemd to remove kubepods-besteffort-pod35497556_3464_49d4_9dc2_8f8153a1db82.slice" pod="openstack/nova-api-0" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.472586 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.501606 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7769eb04-0ff3-41ef-9977-e66563ea4085","Type":"ContainerStarted","Data":"e5d61cbfee6394ab7139462d11ff3290876e24938531670c58f9dd81e3a55b6c"} Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.501680 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.576588 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.594715 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.618637 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.621205 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.623808 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.636462 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.752231 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.752308 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6784l\" (UniqueName: \"kubernetes.io/projected/a1de7242-d69a-4e86-8461-a771c855adf9-kube-api-access-6784l\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.752418 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-config-data\") pod 
\"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.752500 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1de7242-d69a-4e86-8461-a771c855adf9-logs\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.854309 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-config-data\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.854447 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1de7242-d69a-4e86-8461-a771c855adf9-logs\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.854626 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.854693 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6784l\" (UniqueName: \"kubernetes.io/projected/a1de7242-d69a-4e86-8461-a771c855adf9-kube-api-access-6784l\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.856513 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a1de7242-d69a-4e86-8461-a771c855adf9-logs\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.860450 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.862717 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-config-data\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.877730 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6784l\" (UniqueName: \"kubernetes.io/projected/a1de7242-d69a-4e86-8461-a771c855adf9-kube-api-access-6784l\") pod \"nova-api-0\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") " pod="openstack/nova-api-0" Jan 30 08:31:26 crc kubenswrapper[4870]: I0130 08:31:26.953969 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:31:27 crc kubenswrapper[4870]: I0130 08:31:27.438905 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:31:27 crc kubenswrapper[4870]: W0130 08:31:27.453719 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1de7242_d69a_4e86_8461_a771c855adf9.slice/crio-e2b4d49702eb3a18798f8e9ef5ef000c9c2ecb3963af90a645e06a8320f3524e WatchSource:0}: Error finding container e2b4d49702eb3a18798f8e9ef5ef000c9c2ecb3963af90a645e06a8320f3524e: Status 404 returned error can't find the container with id e2b4d49702eb3a18798f8e9ef5ef000c9c2ecb3963af90a645e06a8320f3524e Jan 30 08:31:27 crc kubenswrapper[4870]: I0130 08:31:27.519824 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7769eb04-0ff3-41ef-9977-e66563ea4085","Type":"ContainerStarted","Data":"aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66"} Jan 30 08:31:27 crc kubenswrapper[4870]: I0130 08:31:27.520866 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1de7242-d69a-4e86-8461-a771c855adf9","Type":"ContainerStarted","Data":"e2b4d49702eb3a18798f8e9ef5ef000c9c2ecb3963af90a645e06a8320f3524e"} Jan 30 08:31:27 crc kubenswrapper[4870]: I0130 08:31:27.547598 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.547583745 podStartE2EDuration="2.547583745s" podCreationTimestamp="2026-01-30 08:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:27.540523023 +0000 UTC m=+1326.236070142" watchObservedRunningTime="2026-01-30 08:31:27.547583745 +0000 UTC m=+1326.243130854" Jan 30 08:31:28 crc kubenswrapper[4870]: I0130 08:31:28.089650 4870 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="35497556-3464-49d4-9dc2-8f8153a1db82" path="/var/lib/kubelet/pods/35497556-3464-49d4-9dc2-8f8153a1db82/volumes" Jan 30 08:31:28 crc kubenswrapper[4870]: I0130 08:31:28.537043 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1de7242-d69a-4e86-8461-a771c855adf9","Type":"ContainerStarted","Data":"2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d"} Jan 30 08:31:28 crc kubenswrapper[4870]: I0130 08:31:28.537462 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1de7242-d69a-4e86-8461-a771c855adf9","Type":"ContainerStarted","Data":"91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24"} Jan 30 08:31:28 crc kubenswrapper[4870]: I0130 08:31:28.836126 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:29 crc kubenswrapper[4870]: I0130 08:31:29.581250 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.581227469 podStartE2EDuration="3.581227469s" podCreationTimestamp="2026-01-30 08:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:29.57427793 +0000 UTC m=+1328.269825049" watchObservedRunningTime="2026-01-30 08:31:29.581227469 +0000 UTC m=+1328.276774588" Jan 30 08:31:30 crc kubenswrapper[4870]: I0130 08:31:30.913546 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 08:31:33 crc kubenswrapper[4870]: I0130 08:31:33.836124 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:33 crc kubenswrapper[4870]: I0130 08:31:33.855395 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 
08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.645477 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.854042 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hhwc4"] Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.855675 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.858609 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.858857 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.866492 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hhwc4"] Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.942541 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.942587 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5d7x\" (UniqueName: \"kubernetes.io/projected/c1dfb454-58dc-4c83-b25e-cabaab6cb747-kube-api-access-b5d7x\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.942609 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-config-data\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:34 crc kubenswrapper[4870]: I0130 08:31:34.942637 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-scripts\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.044566 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.044620 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5d7x\" (UniqueName: \"kubernetes.io/projected/c1dfb454-58dc-4c83-b25e-cabaab6cb747-kube-api-access-b5d7x\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.044652 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-config-data\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.044695 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-scripts\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.056082 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-scripts\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.056100 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.057556 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-config-data\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.068609 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5d7x\" (UniqueName: \"kubernetes.io/projected/c1dfb454-58dc-4c83-b25e-cabaab6cb747-kube-api-access-b5d7x\") pod \"nova-cell1-cell-mapping-hhwc4\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.182263 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.913597 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 08:31:35 crc kubenswrapper[4870]: I0130 08:31:35.979899 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 08:31:36 crc kubenswrapper[4870]: I0130 08:31:36.217335 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hhwc4"] Jan 30 08:31:36 crc kubenswrapper[4870]: I0130 08:31:36.633796 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hhwc4" event={"ID":"c1dfb454-58dc-4c83-b25e-cabaab6cb747","Type":"ContainerStarted","Data":"4b78d78b78d21abcc7506de0b24454a50e055736a3c90f711e671ea39c5653ae"} Jan 30 08:31:36 crc kubenswrapper[4870]: I0130 08:31:36.633869 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hhwc4" event={"ID":"c1dfb454-58dc-4c83-b25e-cabaab6cb747","Type":"ContainerStarted","Data":"0018915fa01663a767ed29852c909f16d8cf1a8a4ada56611946c2a2c6b2ea35"} Jan 30 08:31:36 crc kubenswrapper[4870]: I0130 08:31:36.661483 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hhwc4" podStartSLOduration=2.661459423 podStartE2EDuration="2.661459423s" podCreationTimestamp="2026-01-30 08:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:36.651519739 +0000 UTC m=+1335.347066868" watchObservedRunningTime="2026-01-30 08:31:36.661459423 +0000 UTC m=+1335.357006542" Jan 30 08:31:36 crc kubenswrapper[4870]: I0130 08:31:36.684192 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 08:31:36 crc kubenswrapper[4870]: I0130 
08:31:36.955047 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 08:31:36 crc kubenswrapper[4870]: I0130 08:31:36.955115 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 08:31:37 crc kubenswrapper[4870]: I0130 08:31:37.996289 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 08:31:37 crc kubenswrapper[4870]: I0130 08:31:37.996305 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 08:31:42 crc kubenswrapper[4870]: I0130 08:31:41.703612 4870 generic.go:334] "Generic (PLEG): container finished" podID="c1dfb454-58dc-4c83-b25e-cabaab6cb747" containerID="4b78d78b78d21abcc7506de0b24454a50e055736a3c90f711e671ea39c5653ae" exitCode=0 Jan 30 08:31:42 crc kubenswrapper[4870]: I0130 08:31:41.703760 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hhwc4" event={"ID":"c1dfb454-58dc-4c83-b25e-cabaab6cb747","Type":"ContainerDied","Data":"4b78d78b78d21abcc7506de0b24454a50e055736a3c90f711e671ea39c5653ae"} Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.167321 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hhwc4" Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.315074 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-scripts\") pod \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.315256 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5d7x\" (UniqueName: \"kubernetes.io/projected/c1dfb454-58dc-4c83-b25e-cabaab6cb747-kube-api-access-b5d7x\") pod \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.315374 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-combined-ca-bundle\") pod \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.315512 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-config-data\") pod \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\" (UID: \"c1dfb454-58dc-4c83-b25e-cabaab6cb747\") " Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.321616 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1dfb454-58dc-4c83-b25e-cabaab6cb747-kube-api-access-b5d7x" (OuterVolumeSpecName: "kube-api-access-b5d7x") pod "c1dfb454-58dc-4c83-b25e-cabaab6cb747" (UID: "c1dfb454-58dc-4c83-b25e-cabaab6cb747"). InnerVolumeSpecName "kube-api-access-b5d7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.323183 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-scripts" (OuterVolumeSpecName: "scripts") pod "c1dfb454-58dc-4c83-b25e-cabaab6cb747" (UID: "c1dfb454-58dc-4c83-b25e-cabaab6cb747"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.349055 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1dfb454-58dc-4c83-b25e-cabaab6cb747" (UID: "c1dfb454-58dc-4c83-b25e-cabaab6cb747"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.370191 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-config-data" (OuterVolumeSpecName: "config-data") pod "c1dfb454-58dc-4c83-b25e-cabaab6cb747" (UID: "c1dfb454-58dc-4c83-b25e-cabaab6cb747"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.418421 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5d7x\" (UniqueName: \"kubernetes.io/projected/c1dfb454-58dc-4c83-b25e-cabaab6cb747-kube-api-access-b5d7x\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.418469 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.418485 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.418496 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1dfb454-58dc-4c83-b25e-cabaab6cb747-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.733567 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hhwc4" event={"ID":"c1dfb454-58dc-4c83-b25e-cabaab6cb747","Type":"ContainerDied","Data":"0018915fa01663a767ed29852c909f16d8cf1a8a4ada56611946c2a2c6b2ea35"}
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.734024 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0018915fa01663a767ed29852c909f16d8cf1a8a4ada56611946c2a2c6b2ea35"
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.733639 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hhwc4"
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.933383 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.933592 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7769eb04-0ff3-41ef-9977-e66563ea4085" containerName="nova-scheduler-scheduler" containerID="cri-o://aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66" gracePeriod=30
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.985472 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.985711 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-log" containerID="cri-o://91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24" gracePeriod=30
Jan 30 08:31:43 crc kubenswrapper[4870]: I0130 08:31:43.985818 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-api" containerID="cri-o://2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d" gracePeriod=30
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.006407 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.006661 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-log" containerID="cri-o://d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161" gracePeriod=30
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.006774 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-metadata" containerID="cri-o://366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82" gracePeriod=30
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.742393 4870 generic.go:334] "Generic (PLEG): container finished" podID="a1de7242-d69a-4e86-8461-a771c855adf9" containerID="91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24" exitCode=143
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.742476 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1de7242-d69a-4e86-8461-a771c855adf9","Type":"ContainerDied","Data":"91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24"}
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.744674 4870 generic.go:334] "Generic (PLEG): container finished" podID="dcdab968-579c-4189-87c5-05bad5469d6c" containerID="d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161" exitCode=143
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.744694 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdab968-579c-4189-87c5-05bad5469d6c","Type":"ContainerDied","Data":"d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161"}
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.773081 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.880069 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": read tcp 10.217.0.2:33858->10.217.0.216:8775: read: connection reset by peer"
Jan 30 08:31:44 crc kubenswrapper[4870]: I0130 08:31:44.880069 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": read tcp 10.217.0.2:33850->10.217.0.216:8775: read: connection reset by peer"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.240992 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.353013 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.360575 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdab968-579c-4189-87c5-05bad5469d6c-logs\") pod \"dcdab968-579c-4189-87c5-05bad5469d6c\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.360696 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-combined-ca-bundle\") pod \"dcdab968-579c-4189-87c5-05bad5469d6c\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.360822 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-nova-metadata-tls-certs\") pod \"dcdab968-579c-4189-87c5-05bad5469d6c\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.360910 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-config-data\") pod \"dcdab968-579c-4189-87c5-05bad5469d6c\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.360938 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd5ql\" (UniqueName: \"kubernetes.io/projected/dcdab968-579c-4189-87c5-05bad5469d6c-kube-api-access-hd5ql\") pod \"dcdab968-579c-4189-87c5-05bad5469d6c\" (UID: \"dcdab968-579c-4189-87c5-05bad5469d6c\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.361770 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcdab968-579c-4189-87c5-05bad5469d6c-logs" (OuterVolumeSpecName: "logs") pod "dcdab968-579c-4189-87c5-05bad5469d6c" (UID: "dcdab968-579c-4189-87c5-05bad5469d6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.367416 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcdab968-579c-4189-87c5-05bad5469d6c-kube-api-access-hd5ql" (OuterVolumeSpecName: "kube-api-access-hd5ql") pod "dcdab968-579c-4189-87c5-05bad5469d6c" (UID: "dcdab968-579c-4189-87c5-05bad5469d6c"). InnerVolumeSpecName "kube-api-access-hd5ql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.414910 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-config-data" (OuterVolumeSpecName: "config-data") pod "dcdab968-579c-4189-87c5-05bad5469d6c" (UID: "dcdab968-579c-4189-87c5-05bad5469d6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.436818 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcdab968-579c-4189-87c5-05bad5469d6c" (UID: "dcdab968-579c-4189-87c5-05bad5469d6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.439451 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dcdab968-579c-4189-87c5-05bad5469d6c" (UID: "dcdab968-579c-4189-87c5-05bad5469d6c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.463260 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-config-data\") pod \"a1de7242-d69a-4e86-8461-a771c855adf9\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.463702 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1de7242-d69a-4e86-8461-a771c855adf9-logs\") pod \"a1de7242-d69a-4e86-8461-a771c855adf9\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.463820 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6784l\" (UniqueName: \"kubernetes.io/projected/a1de7242-d69a-4e86-8461-a771c855adf9-kube-api-access-6784l\") pod \"a1de7242-d69a-4e86-8461-a771c855adf9\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.463901 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-combined-ca-bundle\") pod \"a1de7242-d69a-4e86-8461-a771c855adf9\" (UID: \"a1de7242-d69a-4e86-8461-a771c855adf9\") "
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.464584 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1de7242-d69a-4e86-8461-a771c855adf9-logs" (OuterVolumeSpecName: "logs") pod "a1de7242-d69a-4e86-8461-a771c855adf9" (UID: "a1de7242-d69a-4e86-8461-a771c855adf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.464754 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.464775 4870 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.464788 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdab968-579c-4189-87c5-05bad5469d6c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.464797 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd5ql\" (UniqueName: \"kubernetes.io/projected/dcdab968-579c-4189-87c5-05bad5469d6c-kube-api-access-hd5ql\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.464807 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdab968-579c-4189-87c5-05bad5469d6c-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.464816 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1de7242-d69a-4e86-8461-a771c855adf9-logs\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.467111 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1de7242-d69a-4e86-8461-a771c855adf9-kube-api-access-6784l" (OuterVolumeSpecName: "kube-api-access-6784l") pod "a1de7242-d69a-4e86-8461-a771c855adf9" (UID: "a1de7242-d69a-4e86-8461-a771c855adf9"). InnerVolumeSpecName "kube-api-access-6784l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.492302 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-config-data" (OuterVolumeSpecName: "config-data") pod "a1de7242-d69a-4e86-8461-a771c855adf9" (UID: "a1de7242-d69a-4e86-8461-a771c855adf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.492418 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1de7242-d69a-4e86-8461-a771c855adf9" (UID: "a1de7242-d69a-4e86-8461-a771c855adf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.567200 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.567240 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6784l\" (UniqueName: \"kubernetes.io/projected/a1de7242-d69a-4e86-8461-a771c855adf9-kube-api-access-6784l\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.567250 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1de7242-d69a-4e86-8461-a771c855adf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.756259 4870 generic.go:334] "Generic (PLEG): container finished" podID="a1de7242-d69a-4e86-8461-a771c855adf9" containerID="2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d" exitCode=0
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.756308 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.756373 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1de7242-d69a-4e86-8461-a771c855adf9","Type":"ContainerDied","Data":"2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d"}
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.756442 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1de7242-d69a-4e86-8461-a771c855adf9","Type":"ContainerDied","Data":"e2b4d49702eb3a18798f8e9ef5ef000c9c2ecb3963af90a645e06a8320f3524e"}
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.756473 4870 scope.go:117] "RemoveContainer" containerID="2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.768963 4870 generic.go:334] "Generic (PLEG): container finished" podID="dcdab968-579c-4189-87c5-05bad5469d6c" containerID="366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82" exitCode=0
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.768997 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdab968-579c-4189-87c5-05bad5469d6c","Type":"ContainerDied","Data":"366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82"}
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.769020 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.769021 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdab968-579c-4189-87c5-05bad5469d6c","Type":"ContainerDied","Data":"01a2218e5e8fc92f78dfaa53cf3e950822f8a4a9869c349d33052df56fc52370"}
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.790531 4870 scope.go:117] "RemoveContainer" containerID="91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.795743 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.806501 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.816901 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.819778 4870 scope.go:117] "RemoveContainer" containerID="2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.821006 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d\": container with ID starting with 2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d not found: ID does not exist" containerID="2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.821076 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d"} err="failed to get container status \"2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d\": rpc error: code = NotFound desc = could not find container \"2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d\": container with ID starting with 2975101aa8a5b490aa2e1e0a7bd6ecd8816f9596dafd908075f0463bf0a9804d not found: ID does not exist"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.821125 4870 scope.go:117] "RemoveContainer" containerID="91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.821587 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24\": container with ID starting with 91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24 not found: ID does not exist" containerID="91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.821644 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24"} err="failed to get container status \"91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24\": rpc error: code = NotFound desc = could not find container \"91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24\": container with ID starting with 91208fdae46fcfb2ca1ade5a3e1bc413f85090f1c513fef0d1f92dd39faaad24 not found: ID does not exist"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.821682 4870 scope.go:117] "RemoveContainer" containerID="366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.826553 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.834265 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.834769 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-log"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.834785 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-log"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.834803 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-api"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.834811 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-api"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.834831 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-log"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.834837 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-log"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.834854 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-metadata"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.834864 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-metadata"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.834890 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1dfb454-58dc-4c83-b25e-cabaab6cb747" containerName="nova-manage"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.834897 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1dfb454-58dc-4c83-b25e-cabaab6cb747" containerName="nova-manage"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.835081 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-log"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.835101 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-metadata"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.835148 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" containerName="nova-api-api"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.835160 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" containerName="nova-metadata-log"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.835173 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1dfb454-58dc-4c83-b25e-cabaab6cb747" containerName="nova-manage"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.836245 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.838578 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.843272 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.845297 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.852115 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.852375 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.867847 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.878986 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.879382 4870 scope.go:117] "RemoveContainer" containerID="d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.924294 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.925972 4870 scope.go:117] "RemoveContainer" containerID="366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.926507 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.926668 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82\": container with ID starting with 366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82 not found: ID does not exist" containerID="366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.926705 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82"} err="failed to get container status \"366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82\": rpc error: code = NotFound desc = could not find container \"366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82\": container with ID starting with 366fec4d0fab4df6c701136c8cfd06595e8cf3699db478403889218f9cdeaf82 not found: ID does not exist"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.926730 4870 scope.go:117] "RemoveContainer" containerID="d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.927052 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161\": container with ID starting with d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161 not found: ID does not exist" containerID="d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.927088 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161"} err="failed to get container status \"d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161\": rpc error: code = NotFound desc = could not find container \"d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161\": container with ID starting with d24242e6d0acaec6d8bd3dd1615c1d42924eeb6b050411d36a27e15a6dedb161 not found: ID does not exist"
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.929086 4870 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 30 08:31:45 crc kubenswrapper[4870]: E0130 08:31:45.929133 4870 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7769eb04-0ff3-41ef-9977-e66563ea4085" containerName="nova-scheduler-scheduler"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.981853 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdea203-220a-457e-b00f-61b48afc7329-logs\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982022 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-config-data\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982097 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982279 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmpzp\" (UniqueName: \"kubernetes.io/projected/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-kube-api-access-tmpzp\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982491 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-config-data\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982812 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982898 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvh5\" (UniqueName: \"kubernetes.io/projected/ccdea203-220a-457e-b00f-61b48afc7329-kube-api-access-gnvh5\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982943 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:45 crc kubenswrapper[4870]: I0130 08:31:45.982996 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-logs\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085351 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-logs\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085390 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdea203-220a-457e-b00f-61b48afc7329-logs\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085436 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-config-data\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085461 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085512 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmpzp\" (UniqueName: \"kubernetes.io/projected/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-kube-api-access-tmpzp\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085555 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-config-data\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085618 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085643 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvh5\" (UniqueName: \"kubernetes.io/projected/ccdea203-220a-457e-b00f-61b48afc7329-kube-api-access-gnvh5\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.085663 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.087068 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-logs\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.087070 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1de7242-d69a-4e86-8461-a771c855adf9" path="/var/lib/kubelet/pods/a1de7242-d69a-4e86-8461-a771c855adf9/volumes"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.087300 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdea203-220a-457e-b00f-61b48afc7329-logs\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.087771 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcdab968-579c-4189-87c5-05bad5469d6c" path="/var/lib/kubelet/pods/dcdab968-579c-4189-87c5-05bad5469d6c/volumes"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.089652 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-config-data\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.090048 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.090627 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-config-data\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.091934 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.092434 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdea203-220a-457e-b00f-61b48afc7329-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.105429 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvh5\" (UniqueName: \"kubernetes.io/projected/ccdea203-220a-457e-b00f-61b48afc7329-kube-api-access-gnvh5\") pod \"nova-metadata-0\" (UID: \"ccdea203-220a-457e-b00f-61b48afc7329\") " pod="openstack/nova-metadata-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.105779 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmpzp\" (UniqueName: \"kubernetes.io/projected/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-kube-api-access-tmpzp\") pod \"nova-api-0\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.167931 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.179919 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.695149 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.756469 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:31:46 crc kubenswrapper[4870]: W0130 08:31:46.759426 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23cc5a83_d937_4f75_8256_e2ea77e8fe0a.slice/crio-6699255e6904997ff78ba3f0df7576bf2779f5c9622dac09638311ffe5a134e0 WatchSource:0}: Error finding container 6699255e6904997ff78ba3f0df7576bf2779f5c9622dac09638311ffe5a134e0: Status 404 returned error can't find the container with id 6699255e6904997ff78ba3f0df7576bf2779f5c9622dac09638311ffe5a134e0 Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.786355 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23cc5a83-d937-4f75-8256-e2ea77e8fe0a","Type":"ContainerStarted","Data":"6699255e6904997ff78ba3f0df7576bf2779f5c9622dac09638311ffe5a134e0"} Jan 30 08:31:46 crc kubenswrapper[4870]: I0130 08:31:46.789019 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccdea203-220a-457e-b00f-61b48afc7329","Type":"ContainerStarted","Data":"06420277392a8c95a7a0decdb3bfa5fa2b5bd8d8fb05e11deef7fc819e13647c"} Jan 30 08:31:47 crc kubenswrapper[4870]: I0130 08:31:47.801104 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23cc5a83-d937-4f75-8256-e2ea77e8fe0a","Type":"ContainerStarted","Data":"cb9bff710edf11f4d70de2ac074768c3d03a94425b5a99d8d702b4bc10eb9597"} Jan 30 08:31:47 crc kubenswrapper[4870]: I0130 08:31:47.801456 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"23cc5a83-d937-4f75-8256-e2ea77e8fe0a","Type":"ContainerStarted","Data":"41ef7bc61259edffa4f8c80705a3d4d0d4a5ae2248d6e06295979bab4e16fc83"} Jan 30 08:31:47 crc kubenswrapper[4870]: I0130 08:31:47.804658 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccdea203-220a-457e-b00f-61b48afc7329","Type":"ContainerStarted","Data":"1cc055e105ed441f83fd209b5650e3acb7acad6bae95d0a2dc677548d12ed7ab"} Jan 30 08:31:47 crc kubenswrapper[4870]: I0130 08:31:47.804740 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccdea203-220a-457e-b00f-61b48afc7329","Type":"ContainerStarted","Data":"f2ed744c83bd703945e51ea336f9496ba896a26dc498f4d3970e4b131f486f11"} Jan 30 08:31:47 crc kubenswrapper[4870]: I0130 08:31:47.831936 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.831908677 podStartE2EDuration="2.831908677s" podCreationTimestamp="2026-01-30 08:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:47.830794813 +0000 UTC m=+1346.526341932" watchObservedRunningTime="2026-01-30 08:31:47.831908677 +0000 UTC m=+1346.527455796" Jan 30 08:31:47 crc kubenswrapper[4870]: I0130 08:31:47.863069 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.863045369 podStartE2EDuration="2.863045369s" podCreationTimestamp="2026-01-30 08:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:47.857960629 +0000 UTC m=+1346.553507768" watchObservedRunningTime="2026-01-30 08:31:47.863045369 +0000 UTC m=+1346.558592488" Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.281182 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.356101 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-config-data\") pod \"7769eb04-0ff3-41ef-9977-e66563ea4085\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.356157 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nzwt\" (UniqueName: \"kubernetes.io/projected/7769eb04-0ff3-41ef-9977-e66563ea4085-kube-api-access-4nzwt\") pod \"7769eb04-0ff3-41ef-9977-e66563ea4085\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.356289 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-combined-ca-bundle\") pod \"7769eb04-0ff3-41ef-9977-e66563ea4085\" (UID: \"7769eb04-0ff3-41ef-9977-e66563ea4085\") " Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.360771 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7769eb04-0ff3-41ef-9977-e66563ea4085-kube-api-access-4nzwt" (OuterVolumeSpecName: "kube-api-access-4nzwt") pod "7769eb04-0ff3-41ef-9977-e66563ea4085" (UID: "7769eb04-0ff3-41ef-9977-e66563ea4085"). InnerVolumeSpecName "kube-api-access-4nzwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.382470 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7769eb04-0ff3-41ef-9977-e66563ea4085" (UID: "7769eb04-0ff3-41ef-9977-e66563ea4085"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.395532 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-config-data" (OuterVolumeSpecName: "config-data") pod "7769eb04-0ff3-41ef-9977-e66563ea4085" (UID: "7769eb04-0ff3-41ef-9977-e66563ea4085"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.460318 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.460363 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nzwt\" (UniqueName: \"kubernetes.io/projected/7769eb04-0ff3-41ef-9977-e66563ea4085-kube-api-access-4nzwt\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.460378 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7769eb04-0ff3-41ef-9977-e66563ea4085-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.895033 4870 generic.go:334] "Generic (PLEG): container finished" podID="7769eb04-0ff3-41ef-9977-e66563ea4085" containerID="aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66" exitCode=0
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.895084 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7769eb04-0ff3-41ef-9977-e66563ea4085","Type":"ContainerDied","Data":"aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66"}
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.895110 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7769eb04-0ff3-41ef-9977-e66563ea4085","Type":"ContainerDied","Data":"e5d61cbfee6394ab7139462d11ff3290876e24938531670c58f9dd81e3a55b6c"}
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.895127 4870 scope.go:117] "RemoveContainer" containerID="aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.895140 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.929330 4870 scope.go:117] "RemoveContainer" containerID="aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66"
Jan 30 08:31:49 crc kubenswrapper[4870]: E0130 08:31:49.929764 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66\": container with ID starting with aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66 not found: ID does not exist" containerID="aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.929792 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66"} err="failed to get container status \"aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66\": rpc error: code = NotFound desc = could not find container \"aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66\": container with ID starting with aae79fe40acd6f2726aec10f649836526fb8e19a3f4e4e131c09bcefac9d4c66 not found: ID does not exist"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.945603 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.961790 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.975700 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 08:31:49 crc kubenswrapper[4870]: E0130 08:31:49.976114 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7769eb04-0ff3-41ef-9977-e66563ea4085" containerName="nova-scheduler-scheduler"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.976125 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7769eb04-0ff3-41ef-9977-e66563ea4085" containerName="nova-scheduler-scheduler"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.976298 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="7769eb04-0ff3-41ef-9977-e66563ea4085" containerName="nova-scheduler-scheduler"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.976832 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.976913 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 08:31:49 crc kubenswrapper[4870]: I0130 08:31:49.988656 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.086279 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7769eb04-0ff3-41ef-9977-e66563ea4085" path="/var/lib/kubelet/pods/7769eb04-0ff3-41ef-9977-e66563ea4085/volumes"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.087655 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.087731 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68jfc\" (UniqueName: \"kubernetes.io/projected/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-kube-api-access-68jfc\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.087892 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-config-data\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.190029 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.190247 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68jfc\" (UniqueName: \"kubernetes.io/projected/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-kube-api-access-68jfc\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.190459 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-config-data\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.196782 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.199303 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-config-data\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.211992 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68jfc\" (UniqueName: \"kubernetes.io/projected/ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6-kube-api-access-68jfc\") pod \"nova-scheduler-0\" (UID: \"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6\") " pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.316317 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.865011 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 08:31:50 crc kubenswrapper[4870]: I0130 08:31:50.908791 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6","Type":"ContainerStarted","Data":"658a5390562c6f35ef0f73da7eb87bdc477c4b9177074bae8403787f752b2ffd"}
Jan 30 08:31:51 crc kubenswrapper[4870]: I0130 08:31:51.180315 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 08:31:51 crc kubenswrapper[4870]: I0130 08:31:51.180634 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 08:31:51 crc kubenswrapper[4870]: I0130 08:31:51.922787 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6","Type":"ContainerStarted","Data":"a0d9e18e9847c78a60b7232ac124e6b11ea0da185c6df5064334480e32604d14"}
Jan 30 08:31:51 crc kubenswrapper[4870]: I0130 08:31:51.958697 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9586743909999997 podStartE2EDuration="2.958674391s" podCreationTimestamp="2026-01-30 08:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:31:51.942750239 +0000 UTC m=+1350.638297368" watchObservedRunningTime="2026-01-30 08:31:51.958674391 +0000 UTC m=+1350.654221510"
Jan 30 08:31:55 crc kubenswrapper[4870]: I0130 08:31:55.250379 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:31:55 crc kubenswrapper[4870]: I0130 08:31:55.250796 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 08:31:55 crc kubenswrapper[4870]: I0130 08:31:55.317565 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 30 08:31:56 crc kubenswrapper[4870]: I0130 08:31:56.168655 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 08:31:56 crc kubenswrapper[4870]: I0130 08:31:56.168736 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 08:31:56 crc kubenswrapper[4870]: I0130 08:31:56.180773 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 08:31:56 crc kubenswrapper[4870]: I0130 08:31:56.180850 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 08:31:57 crc kubenswrapper[4870]: I0130 08:31:57.251215 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.224:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 08:31:57 crc kubenswrapper[4870]: I0130 08:31:57.267179 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.224:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)"
Jan 30 08:31:57 crc kubenswrapper[4870]: I0130 08:31:57.267235 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ccdea203-220a-457e-b00f-61b48afc7329" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 08:31:57 crc kubenswrapper[4870]: I0130 08:31:57.267626 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ccdea203-220a-457e-b00f-61b48afc7329" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 08:32:00 crc kubenswrapper[4870]: I0130 08:32:00.316745 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 30 08:32:00 crc kubenswrapper[4870]: I0130 08:32:00.373211 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 30 08:32:01 crc kubenswrapper[4870]: I0130 08:32:01.085002 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.179982 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.180822 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.181859 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.181929 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.190970 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.193358 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.194071 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.196358 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.203385 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.461625 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6999845677-vd26g"]
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.465185 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.476360 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6999845677-vd26g"]
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.567055 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-nb\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.567204 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-svc\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.567331 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-swift-storage-0\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.567581 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2gv2\" (UniqueName: \"kubernetes.io/projected/8a0f9be1-926a-4340-9f05-ba673e3e471e-kube-api-access-c2gv2\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.567633 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-config\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.567776 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-sb\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.670094 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-nb\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.670153 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-svc\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.670188 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-swift-storage-0\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.670242 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gv2\" (UniqueName: \"kubernetes.io/projected/8a0f9be1-926a-4340-9f05-ba673e3e471e-kube-api-access-c2gv2\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.670260 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-config\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.670298 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-sb\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.671114 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-sb\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.671216 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-nb\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.671456 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-svc\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.671492 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-swift-storage-0\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.671542 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-config\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.695394 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gv2\" (UniqueName: \"kubernetes.io/projected/8a0f9be1-926a-4340-9f05-ba673e3e471e-kube-api-access-c2gv2\") pod \"dnsmasq-dns-6999845677-vd26g\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:06 crc kubenswrapper[4870]: I0130 08:32:06.790992 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6999845677-vd26g"
Jan 30 08:32:07 crc kubenswrapper[4870]: I0130 08:32:07.117679 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 30 08:32:07 crc kubenswrapper[4870]: I0130 08:32:07.283019 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6999845677-vd26g"]
Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.123970 4870 generic.go:334] "Generic (PLEG): container finished" podID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerID="cf4692acee92608a7992da7d8327f9e59bf6302ddd00cf4b1c51b56d002d56e2" exitCode=0
Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.124058 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999845677-vd26g" event={"ID":"8a0f9be1-926a-4340-9f05-ba673e3e471e","Type":"ContainerDied","Data":"cf4692acee92608a7992da7d8327f9e59bf6302ddd00cf4b1c51b56d002d56e2"}
Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.124515 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999845677-vd26g" event={"ID":"8a0f9be1-926a-4340-9f05-ba673e3e471e","Type":"ContainerStarted","Data":"b68353bb62e0c010f98866001c94c9fbd25b787dc4392f6cd92563052dd236dc"}
Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.786512 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.799304 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.799581 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-central-agent" containerID="cri-o://718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531" gracePeriod=30
Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.799648 
4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="proxy-httpd" containerID="cri-o://74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a" gracePeriod=30 Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.799710 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="sg-core" containerID="cri-o://75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5" gracePeriod=30 Jan 30 08:32:08 crc kubenswrapper[4870]: I0130 08:32:08.799982 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-notification-agent" containerID="cri-o://919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2" gracePeriod=30 Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.139818 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999845677-vd26g" event={"ID":"8a0f9be1-926a-4340-9f05-ba673e3e471e","Type":"ContainerStarted","Data":"478f243e0d74f6dbf93b850491c64bcf3ea2a501bedee4e704be06e6e754b799"} Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.140851 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6999845677-vd26g" Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.147753 4870 generic.go:334] "Generic (PLEG): container finished" podID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerID="74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a" exitCode=0 Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.147792 4870 generic.go:334] "Generic (PLEG): container finished" podID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerID="75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5" exitCode=2 Jan 30 08:32:09 
crc kubenswrapper[4870]: I0130 08:32:09.147829 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerDied","Data":"74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a"} Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.147872 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerDied","Data":"75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5"} Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.148088 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-log" containerID="cri-o://41ef7bc61259edffa4f8c80705a3d4d0d4a5ae2248d6e06295979bab4e16fc83" gracePeriod=30 Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.148107 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-api" containerID="cri-o://cb9bff710edf11f4d70de2ac074768c3d03a94425b5a99d8d702b4bc10eb9597" gracePeriod=30 Jan 30 08:32:09 crc kubenswrapper[4870]: I0130 08:32:09.165331 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6999845677-vd26g" podStartSLOduration=3.165313673 podStartE2EDuration="3.165313673s" podCreationTimestamp="2026-01-30 08:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:32:09.160424019 +0000 UTC m=+1367.855971138" watchObservedRunningTime="2026-01-30 08:32:09.165313673 +0000 UTC m=+1367.860860782" Jan 30 08:32:10 crc kubenswrapper[4870]: I0130 08:32:10.158542 4870 generic.go:334] "Generic (PLEG): container finished" podID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" 
containerID="718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531" exitCode=0 Jan 30 08:32:10 crc kubenswrapper[4870]: I0130 08:32:10.158626 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerDied","Data":"718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531"} Jan 30 08:32:10 crc kubenswrapper[4870]: I0130 08:32:10.160124 4870 generic.go:334] "Generic (PLEG): container finished" podID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerID="41ef7bc61259edffa4f8c80705a3d4d0d4a5ae2248d6e06295979bab4e16fc83" exitCode=143 Jan 30 08:32:10 crc kubenswrapper[4870]: I0130 08:32:10.160205 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23cc5a83-d937-4f75-8256-e2ea77e8fe0a","Type":"ContainerDied","Data":"41ef7bc61259edffa4f8c80705a3d4d0d4a5ae2248d6e06295979bab4e16fc83"} Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.178442 4870 generic.go:334] "Generic (PLEG): container finished" podID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerID="cb9bff710edf11f4d70de2ac074768c3d03a94425b5a99d8d702b4bc10eb9597" exitCode=0 Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.179584 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23cc5a83-d937-4f75-8256-e2ea77e8fe0a","Type":"ContainerDied","Data":"cb9bff710edf11f4d70de2ac074768c3d03a94425b5a99d8d702b4bc10eb9597"} Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.360401 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.481262 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-logs" (OuterVolumeSpecName: "logs") pod "23cc5a83-d937-4f75-8256-e2ea77e8fe0a" (UID: "23cc5a83-d937-4f75-8256-e2ea77e8fe0a"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.481371 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-logs\") pod \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.481550 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-config-data\") pod \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.481592 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-combined-ca-bundle\") pod \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.481632 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmpzp\" (UniqueName: \"kubernetes.io/projected/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-kube-api-access-tmpzp\") pod \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\" (UID: \"23cc5a83-d937-4f75-8256-e2ea77e8fe0a\") " Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.482151 4870 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-logs\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.487487 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-kube-api-access-tmpzp" (OuterVolumeSpecName: 
"kube-api-access-tmpzp") pod "23cc5a83-d937-4f75-8256-e2ea77e8fe0a" (UID: "23cc5a83-d937-4f75-8256-e2ea77e8fe0a"). InnerVolumeSpecName "kube-api-access-tmpzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.513801 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-config-data" (OuterVolumeSpecName: "config-data") pod "23cc5a83-d937-4f75-8256-e2ea77e8fe0a" (UID: "23cc5a83-d937-4f75-8256-e2ea77e8fe0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.519306 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23cc5a83-d937-4f75-8256-e2ea77e8fe0a" (UID: "23cc5a83-d937-4f75-8256-e2ea77e8fe0a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.584302 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.584341 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:11 crc kubenswrapper[4870]: I0130 08:32:11.584356 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmpzp\" (UniqueName: \"kubernetes.io/projected/23cc5a83-d937-4f75-8256-e2ea77e8fe0a-kube-api-access-tmpzp\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.189915 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23cc5a83-d937-4f75-8256-e2ea77e8fe0a","Type":"ContainerDied","Data":"6699255e6904997ff78ba3f0df7576bf2779f5c9622dac09638311ffe5a134e0"} Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.189971 4870 scope.go:117] "RemoveContainer" containerID="cb9bff710edf11f4d70de2ac074768c3d03a94425b5a99d8d702b4bc10eb9597" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.190119 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.214823 4870 scope.go:117] "RemoveContainer" containerID="41ef7bc61259edffa4f8c80705a3d4d0d4a5ae2248d6e06295979bab4e16fc83" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.225792 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.248081 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.261666 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 08:32:12 crc kubenswrapper[4870]: E0130 08:32:12.262272 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-log" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.262301 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-log" Jan 30 08:32:12 crc kubenswrapper[4870]: E0130 08:32:12.262335 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-api" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.262344 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-api" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.262600 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-log" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.262630 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" containerName="nova-api-api" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.263855 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.265477 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.267331 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.267596 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.281349 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.401783 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g4s\" (UniqueName: \"kubernetes.io/projected/ed40aa22-a330-46ab-9971-39e764e63ff7-kube-api-access-25g4s\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.402112 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed40aa22-a330-46ab-9971-39e764e63ff7-logs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.402188 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-config-data\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.402284 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-public-tls-certs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.402327 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.402555 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.504727 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-config-data\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.504800 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-public-tls-certs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.504823 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 
08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.505208 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.505276 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25g4s\" (UniqueName: \"kubernetes.io/projected/ed40aa22-a330-46ab-9971-39e764e63ff7-kube-api-access-25g4s\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.505358 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed40aa22-a330-46ab-9971-39e764e63ff7-logs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.507294 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed40aa22-a330-46ab-9971-39e764e63ff7-logs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.512311 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-config-data\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.512695 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.522102 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.525397 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed40aa22-a330-46ab-9971-39e764e63ff7-public-tls-certs\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.531725 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g4s\" (UniqueName: \"kubernetes.io/projected/ed40aa22-a330-46ab-9971-39e764e63ff7-kube-api-access-25g4s\") pod \"nova-api-0\" (UID: \"ed40aa22-a330-46ab-9971-39e764e63ff7\") " pod="openstack/nova-api-0" Jan 30 08:32:12 crc kubenswrapper[4870]: I0130 08:32:12.583675 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 08:32:13 crc kubenswrapper[4870]: I0130 08:32:13.112356 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 08:32:13 crc kubenswrapper[4870]: I0130 08:32:13.248157 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed40aa22-a330-46ab-9971-39e764e63ff7","Type":"ContainerStarted","Data":"2a86347911ba5fb18ae9df61916a53aec7e980b31b61f671dff024f66f3d7263"} Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.087694 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23cc5a83-d937-4f75-8256-e2ea77e8fe0a" path="/var/lib/kubelet/pods/23cc5a83-d937-4f75-8256-e2ea77e8fe0a/volumes" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.194137 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.243627 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnc7j\" (UniqueName: \"kubernetes.io/projected/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-kube-api-access-hnc7j\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.243677 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-sg-core-conf-yaml\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.243761 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-run-httpd\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 
08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.243833 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-log-httpd\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.243900 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-scripts\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.243959 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-ceilometer-tls-certs\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.244006 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-combined-ca-bundle\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.244206 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-config-data\") pod \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\" (UID: \"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27\") " Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.244323 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.244659 4870 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.244740 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.250640 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-kube-api-access-hnc7j" (OuterVolumeSpecName: "kube-api-access-hnc7j") pod "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "kube-api-access-hnc7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.258424 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-scripts" (OuterVolumeSpecName: "scripts") pod "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.273288 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed40aa22-a330-46ab-9971-39e764e63ff7","Type":"ContainerStarted","Data":"9d25fed9d6ec793acd29075f40a1362f97f3f12efc8a4fd33062711db8a9bb39"} Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.276143 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ed40aa22-a330-46ab-9971-39e764e63ff7","Type":"ContainerStarted","Data":"a2fe1d66955d487443e012d436d6f7b8bfa1169a936e81749a67c465204fa22a"} Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.288632 4870 generic.go:334] "Generic (PLEG): container finished" podID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerID="919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2" exitCode=0 Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.288937 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerDied","Data":"919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2"} Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.289045 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe5bff0c-1f97-4c2d-9f95-46c7c3799d27","Type":"ContainerDied","Data":"878d9967ec40148177071222b9ab0dc547347647130442c6f0bf2d56dd31e4d4"} Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.289133 4870 scope.go:117] "RemoveContainer" containerID="74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.289375 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.290197 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.307049 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.307027495 podStartE2EDuration="2.307027495s" podCreationTimestamp="2026-01-30 08:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:32:14.299737244 +0000 UTC m=+1372.995284353" watchObservedRunningTime="2026-01-30 08:32:14.307027495 +0000 UTC m=+1373.002574614" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.331206 4870 scope.go:117] "RemoveContainer" containerID="75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.336136 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.347460 4870 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.347491 4870 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.347503 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnc7j\" (UniqueName: \"kubernetes.io/projected/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-kube-api-access-hnc7j\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.347513 4870 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.347523 4870 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.348667 4870 scope.go:117] "RemoveContainer" containerID="919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.367788 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.373056 4870 scope.go:117] "RemoveContainer" containerID="718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.393196 4870 scope.go:117] "RemoveContainer" containerID="74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a" Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.393706 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a\": container with ID starting with 74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a not found: ID does not exist" containerID="74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.393749 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a"} err="failed to get container status \"74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a\": rpc error: code = NotFound desc = could not find container \"74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a\": container with ID starting with 74609d48db351a4551e4a14571d8ceeb785d76a089cbc5d4cef7533f25e42b9a not found: ID does not exist" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.393779 4870 scope.go:117] "RemoveContainer" containerID="75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5" Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.394190 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5\": container with ID starting with 
75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5 not found: ID does not exist" containerID="75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.394226 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5"} err="failed to get container status \"75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5\": rpc error: code = NotFound desc = could not find container \"75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5\": container with ID starting with 75846033cc04fdf75214f4cc82a37a97af903516f1d459b619c65d4c2c05c4e5 not found: ID does not exist" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.394253 4870 scope.go:117] "RemoveContainer" containerID="919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2" Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.394556 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2\": container with ID starting with 919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2 not found: ID does not exist" containerID="919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.394585 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2"} err="failed to get container status \"919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2\": rpc error: code = NotFound desc = could not find container \"919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2\": container with ID starting with 919ec938ff2a2a8f2ab814ba8d607fde425c77054e1f0ed868c053bea67814e2 not found: ID does not 
exist" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.394608 4870 scope.go:117] "RemoveContainer" containerID="718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531" Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.394845 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531\": container with ID starting with 718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531 not found: ID does not exist" containerID="718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.395045 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531"} err="failed to get container status \"718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531\": rpc error: code = NotFound desc = could not find container \"718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531\": container with ID starting with 718104975a12146cc00fc56171ab65a7bdbb1698df506296389cd8521d23c531 not found: ID does not exist" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.407438 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-config-data" (OuterVolumeSpecName: "config-data") pod "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" (UID: "fe5bff0c-1f97-4c2d-9f95-46c7c3799d27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.449888 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.450109 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.626460 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.639310 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.653782 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.654324 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-central-agent" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654346 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-central-agent" Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.654364 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="proxy-httpd" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654372 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="proxy-httpd" Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.654383 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-notification-agent" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654392 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-notification-agent" Jan 30 08:32:14 crc kubenswrapper[4870]: E0130 08:32:14.654407 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="sg-core" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654414 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="sg-core" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654653 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="sg-core" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654916 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-central-agent" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654949 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="ceilometer-notification-agent" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.654966 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" containerName="proxy-httpd" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.657349 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.661460 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.663015 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.663244 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.673111 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756030 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0944a474-a4a5-4ff7-95cf-cd783c051a16-run-httpd\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756098 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-scripts\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756128 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-config-data\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756166 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0944a474-a4a5-4ff7-95cf-cd783c051a16-log-httpd\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756218 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756246 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxzb\" (UniqueName: \"kubernetes.io/projected/0944a474-a4a5-4ff7-95cf-cd783c051a16-kube-api-access-kdxzb\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756286 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.756327 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.858515 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-config-data\") pod \"ceilometer-0\" (UID: 
\"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.859137 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0944a474-a4a5-4ff7-95cf-cd783c051a16-log-httpd\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.859228 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.859266 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxzb\" (UniqueName: \"kubernetes.io/projected/0944a474-a4a5-4ff7-95cf-cd783c051a16-kube-api-access-kdxzb\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.859343 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.859422 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.859486 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0944a474-a4a5-4ff7-95cf-cd783c051a16-run-httpd\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.859570 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-scripts\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.860068 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0944a474-a4a5-4ff7-95cf-cd783c051a16-log-httpd\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.860375 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0944a474-a4a5-4ff7-95cf-cd783c051a16-run-httpd\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.866860 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.867155 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 
crc kubenswrapper[4870]: I0130 08:32:14.867564 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-config-data\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.869167 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-scripts\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.882437 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0944a474-a4a5-4ff7-95cf-cd783c051a16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.889666 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxzb\" (UniqueName: \"kubernetes.io/projected/0944a474-a4a5-4ff7-95cf-cd783c051a16-kube-api-access-kdxzb\") pod \"ceilometer-0\" (UID: \"0944a474-a4a5-4ff7-95cf-cd783c051a16\") " pod="openstack/ceilometer-0" Jan 30 08:32:14 crc kubenswrapper[4870]: I0130 08:32:14.971572 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 08:32:15 crc kubenswrapper[4870]: W0130 08:32:15.455331 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0944a474_a4a5_4ff7_95cf_cd783c051a16.slice/crio-750b36c18f29317c6527340cadcc3a4ba0e29ace741d3132fcf943a00dce946a WatchSource:0}: Error finding container 750b36c18f29317c6527340cadcc3a4ba0e29ace741d3132fcf943a00dce946a: Status 404 returned error can't find the container with id 750b36c18f29317c6527340cadcc3a4ba0e29ace741d3132fcf943a00dce946a Jan 30 08:32:15 crc kubenswrapper[4870]: I0130 08:32:15.460159 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 08:32:16 crc kubenswrapper[4870]: I0130 08:32:16.085703 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5bff0c-1f97-4c2d-9f95-46c7c3799d27" path="/var/lib/kubelet/pods/fe5bff0c-1f97-4c2d-9f95-46c7c3799d27/volumes" Jan 30 08:32:16 crc kubenswrapper[4870]: I0130 08:32:16.309736 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0944a474-a4a5-4ff7-95cf-cd783c051a16","Type":"ContainerStarted","Data":"d4557d1fa9418b5f4939daed4023af55cb89d4096102f6bba8e64b01ea09b0cc"} Jan 30 08:32:16 crc kubenswrapper[4870]: I0130 08:32:16.309776 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0944a474-a4a5-4ff7-95cf-cd783c051a16","Type":"ContainerStarted","Data":"750b36c18f29317c6527340cadcc3a4ba0e29ace741d3132fcf943a00dce946a"} Jan 30 08:32:16 crc kubenswrapper[4870]: I0130 08:32:16.793049 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6999845677-vd26g" Jan 30 08:32:16 crc kubenswrapper[4870]: I0130 08:32:16.846676 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7777964479-kzgv2"] Jan 30 08:32:16 crc kubenswrapper[4870]: I0130 08:32:16.846921 
4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7777964479-kzgv2" podUID="d5925267-e75f-4398-af96-6856710c57f3" containerName="dnsmasq-dns" containerID="cri-o://fa09eeaef0e8d067370ba4e9a769247437b75f3bbc783ef72b9b39a713b37db0" gracePeriod=10 Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.326018 4870 generic.go:334] "Generic (PLEG): container finished" podID="d5925267-e75f-4398-af96-6856710c57f3" containerID="fa09eeaef0e8d067370ba4e9a769247437b75f3bbc783ef72b9b39a713b37db0" exitCode=0 Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.326104 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7777964479-kzgv2" event={"ID":"d5925267-e75f-4398-af96-6856710c57f3","Type":"ContainerDied","Data":"fa09eeaef0e8d067370ba4e9a769247437b75f3bbc783ef72b9b39a713b37db0"} Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.326386 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7777964479-kzgv2" event={"ID":"d5925267-e75f-4398-af96-6856710c57f3","Type":"ContainerDied","Data":"0d8401900436a6761c400a0be0a0bdd42a9aa8031b291c68c5469cef3ec4cd02"} Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.326401 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d8401900436a6761c400a0be0a0bdd42a9aa8031b291c68c5469cef3ec4cd02" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.328904 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0944a474-a4a5-4ff7-95cf-cd783c051a16","Type":"ContainerStarted","Data":"d46ec846e503baec4f5ee7cc68d33f2537579aee1d790bfccd66ca9508546887"} Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.409277 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.509045 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-swift-storage-0\") pod \"d5925267-e75f-4398-af96-6856710c57f3\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.509392 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-sb\") pod \"d5925267-e75f-4398-af96-6856710c57f3\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.509529 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-config\") pod \"d5925267-e75f-4398-af96-6856710c57f3\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.509563 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br4sd\" (UniqueName: \"kubernetes.io/projected/d5925267-e75f-4398-af96-6856710c57f3-kube-api-access-br4sd\") pod \"d5925267-e75f-4398-af96-6856710c57f3\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.509644 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-svc\") pod \"d5925267-e75f-4398-af96-6856710c57f3\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.509689 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-nb\") pod \"d5925267-e75f-4398-af96-6856710c57f3\" (UID: \"d5925267-e75f-4398-af96-6856710c57f3\") " Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.515996 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5925267-e75f-4398-af96-6856710c57f3-kube-api-access-br4sd" (OuterVolumeSpecName: "kube-api-access-br4sd") pod "d5925267-e75f-4398-af96-6856710c57f3" (UID: "d5925267-e75f-4398-af96-6856710c57f3"). InnerVolumeSpecName "kube-api-access-br4sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.565189 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5925267-e75f-4398-af96-6856710c57f3" (UID: "d5925267-e75f-4398-af96-6856710c57f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.567527 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5925267-e75f-4398-af96-6856710c57f3" (UID: "d5925267-e75f-4398-af96-6856710c57f3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.578455 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5925267-e75f-4398-af96-6856710c57f3" (UID: "d5925267-e75f-4398-af96-6856710c57f3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.596199 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-config" (OuterVolumeSpecName: "config") pod "d5925267-e75f-4398-af96-6856710c57f3" (UID: "d5925267-e75f-4398-af96-6856710c57f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.604637 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d5925267-e75f-4398-af96-6856710c57f3" (UID: "d5925267-e75f-4398-af96-6856710c57f3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.611684 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.611721 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.611735 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.611748 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.611763 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5925267-e75f-4398-af96-6856710c57f3-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:17 crc kubenswrapper[4870]: I0130 08:32:17.611774 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br4sd\" (UniqueName: \"kubernetes.io/projected/d5925267-e75f-4398-af96-6856710c57f3-kube-api-access-br4sd\") on node \"crc\" DevicePath \"\"" Jan 30 08:32:18 crc kubenswrapper[4870]: I0130 08:32:18.343790 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7777964479-kzgv2" Jan 30 08:32:18 crc kubenswrapper[4870]: I0130 08:32:18.343779 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0944a474-a4a5-4ff7-95cf-cd783c051a16","Type":"ContainerStarted","Data":"791a5b366987d43d526b4f900e7e9b4645980fa599dfbf26ae5ef4bb43b58752"} Jan 30 08:32:18 crc kubenswrapper[4870]: I0130 08:32:18.372648 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7777964479-kzgv2"] Jan 30 08:32:18 crc kubenswrapper[4870]: I0130 08:32:18.386507 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7777964479-kzgv2"] Jan 30 08:32:19 crc kubenswrapper[4870]: I0130 08:32:19.363627 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0944a474-a4a5-4ff7-95cf-cd783c051a16","Type":"ContainerStarted","Data":"a316b62987c4d89c420cb63d6dd7a4ee879dba25469888ddee99ccb8f6317ee1"} Jan 30 08:32:19 crc kubenswrapper[4870]: I0130 08:32:19.364090 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 08:32:19 crc kubenswrapper[4870]: I0130 08:32:19.394759 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=1.7609742800000001 podStartE2EDuration="5.394736781s" podCreationTimestamp="2026-01-30 08:32:14 +0000 UTC" firstStartedPulling="2026-01-30 08:32:15.458451797 +0000 UTC m=+1374.153998916" lastFinishedPulling="2026-01-30 08:32:19.092214308 +0000 UTC m=+1377.787761417" observedRunningTime="2026-01-30 08:32:19.389991792 +0000 UTC m=+1378.085538911" watchObservedRunningTime="2026-01-30 08:32:19.394736781 +0000 UTC m=+1378.090283900" Jan 30 08:32:20 crc kubenswrapper[4870]: I0130 08:32:20.085315 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5925267-e75f-4398-af96-6856710c57f3" path="/var/lib/kubelet/pods/d5925267-e75f-4398-af96-6856710c57f3/volumes" Jan 30 08:32:22 crc kubenswrapper[4870]: I0130 08:32:22.584713 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 08:32:22 crc kubenswrapper[4870]: I0130 08:32:22.585860 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 08:32:23 crc kubenswrapper[4870]: I0130 08:32:23.602145 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ed40aa22-a330-46ab-9971-39e764e63ff7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.228:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 08:32:23 crc kubenswrapper[4870]: I0130 08:32:23.602231 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ed40aa22-a330-46ab-9971-39e764e63ff7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.228:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.250095 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.250473 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.250540 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.251482 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fed0dc1b3541c4793a049fc7617c3773e2d05f0ebfb934d3acc5ededede3b844"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.251579 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://fed0dc1b3541c4793a049fc7617c3773e2d05f0ebfb934d3acc5ededede3b844" gracePeriod=600 Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.434742 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="fed0dc1b3541c4793a049fc7617c3773e2d05f0ebfb934d3acc5ededede3b844" exitCode=0 Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.434798 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" 
event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"fed0dc1b3541c4793a049fc7617c3773e2d05f0ebfb934d3acc5ededede3b844"} Jan 30 08:32:25 crc kubenswrapper[4870]: I0130 08:32:25.434844 4870 scope.go:117] "RemoveContainer" containerID="736e1ea4b0b2b4fa81dc9ec4fa9950e05f221b62734f7e8de7d7969e9158f7ae" Jan 30 08:32:26 crc kubenswrapper[4870]: I0130 08:32:26.452703 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"} Jan 30 08:32:32 crc kubenswrapper[4870]: I0130 08:32:32.600691 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 08:32:32 crc kubenswrapper[4870]: I0130 08:32:32.601956 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 08:32:32 crc kubenswrapper[4870]: I0130 08:32:32.615798 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 08:32:32 crc kubenswrapper[4870]: I0130 08:32:32.620058 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 08:32:33 crc kubenswrapper[4870]: I0130 08:32:33.551416 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 08:32:33 crc kubenswrapper[4870]: I0130 08:32:33.571498 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 08:32:44 crc kubenswrapper[4870]: I0130 08:32:44.990828 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.484952 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d7xmh"] Jan 30 08:32:49 crc 
kubenswrapper[4870]: E0130 08:32:49.485873 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5925267-e75f-4398-af96-6856710c57f3" containerName="init" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.485901 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5925267-e75f-4398-af96-6856710c57f3" containerName="init" Jan 30 08:32:49 crc kubenswrapper[4870]: E0130 08:32:49.485933 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5925267-e75f-4398-af96-6856710c57f3" containerName="dnsmasq-dns" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.485939 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5925267-e75f-4398-af96-6856710c57f3" containerName="dnsmasq-dns" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.486111 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5925267-e75f-4398-af96-6856710c57f3" containerName="dnsmasq-dns" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.487669 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.513795 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7xmh"] Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.522929 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-catalog-content\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.523097 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-utilities\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.523213 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mqzj\" (UniqueName: \"kubernetes.io/projected/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-kube-api-access-4mqzj\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.625514 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-utilities\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.625789 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4mqzj\" (UniqueName: \"kubernetes.io/projected/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-kube-api-access-4mqzj\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.626079 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-catalog-content\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.626144 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-utilities\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.626337 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-catalog-content\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.644512 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mqzj\" (UniqueName: \"kubernetes.io/projected/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-kube-api-access-4mqzj\") pod \"redhat-operators-d7xmh\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:49 crc kubenswrapper[4870]: I0130 08:32:49.806852 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:50 crc kubenswrapper[4870]: I0130 08:32:50.294231 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7xmh"] Jan 30 08:32:50 crc kubenswrapper[4870]: I0130 08:32:50.900837 4870 generic.go:334] "Generic (PLEG): container finished" podID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerID="5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b" exitCode=0 Jan 30 08:32:50 crc kubenswrapper[4870]: I0130 08:32:50.900907 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7xmh" event={"ID":"aeaabceb-b50c-48b6-b72d-d759f1bda8c1","Type":"ContainerDied","Data":"5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b"} Jan 30 08:32:50 crc kubenswrapper[4870]: I0130 08:32:50.900938 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7xmh" event={"ID":"aeaabceb-b50c-48b6-b72d-d759f1bda8c1","Type":"ContainerStarted","Data":"ce7d2fa7b2f490ff295696455cb253c22c42342b96f4aac8e217dc481875d3d8"} Jan 30 08:32:52 crc kubenswrapper[4870]: I0130 08:32:52.934283 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7xmh" event={"ID":"aeaabceb-b50c-48b6-b72d-d759f1bda8c1","Type":"ContainerStarted","Data":"949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff"} Jan 30 08:32:53 crc kubenswrapper[4870]: I0130 08:32:53.946297 4870 generic.go:334] "Generic (PLEG): container finished" podID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerID="949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff" exitCode=0 Jan 30 08:32:53 crc kubenswrapper[4870]: I0130 08:32:53.946397 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7xmh" 
event={"ID":"aeaabceb-b50c-48b6-b72d-d759f1bda8c1","Type":"ContainerDied","Data":"949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff"} Jan 30 08:32:54 crc kubenswrapper[4870]: I0130 08:32:54.888979 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:32:55 crc kubenswrapper[4870]: I0130 08:32:55.818217 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:32:58 crc kubenswrapper[4870]: I0130 08:32:58.992217 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7xmh" event={"ID":"aeaabceb-b50c-48b6-b72d-d759f1bda8c1","Type":"ContainerStarted","Data":"d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8"} Jan 30 08:32:59 crc kubenswrapper[4870]: I0130 08:32:59.010996 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d7xmh" podStartSLOduration=3.042690969 podStartE2EDuration="10.01097556s" podCreationTimestamp="2026-01-30 08:32:49 +0000 UTC" firstStartedPulling="2026-01-30 08:32:50.903332385 +0000 UTC m=+1409.598879504" lastFinishedPulling="2026-01-30 08:32:57.871616986 +0000 UTC m=+1416.567164095" observedRunningTime="2026-01-30 08:32:59.007529911 +0000 UTC m=+1417.703077030" watchObservedRunningTime="2026-01-30 08:32:59.01097556 +0000 UTC m=+1417.706522669" Jan 30 08:32:59 crc kubenswrapper[4870]: I0130 08:32:59.667714 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerName="rabbitmq" containerID="cri-o://2d90fc261d4a0e6355b34b516c467ce7b3ce867fbf835cf5614291d45b33a700" gracePeriod=604796 Jan 30 08:32:59 crc kubenswrapper[4870]: I0130 08:32:59.770802 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" 
containerName="rabbitmq" containerID="cri-o://121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f" gracePeriod=604797 Jan 30 08:32:59 crc kubenswrapper[4870]: I0130 08:32:59.811252 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:32:59 crc kubenswrapper[4870]: I0130 08:32:59.811303 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:33:00 crc kubenswrapper[4870]: I0130 08:33:00.884228 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d7xmh" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="registry-server" probeResult="failure" output=< Jan 30 08:33:00 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:33:00 crc kubenswrapper[4870]: > Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.011031 4870 generic.go:334] "Generic (PLEG): container finished" podID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerID="2d90fc261d4a0e6355b34b516c467ce7b3ce867fbf835cf5614291d45b33a700" exitCode=0 Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.011266 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f21b9d-25bf-4a64-94ef-51d83b662ab3","Type":"ContainerDied","Data":"2d90fc261d4a0e6355b34b516c467ce7b3ce867fbf835cf5614291d45b33a700"} Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.389986 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502291 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn9pp\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-kube-api-access-mn9pp\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502355 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-tls\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502385 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f21b9d-25bf-4a64-94ef-51d83b662ab3-pod-info\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502412 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-plugins-conf\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502494 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-config-data\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502536 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-plugins\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502579 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-erlang-cookie\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502624 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-server-conf\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502672 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-confd\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502709 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f21b9d-25bf-4a64-94ef-51d83b662ab3-erlang-cookie-secret\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.502743 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\" (UID: \"97f21b9d-25bf-4a64-94ef-51d83b662ab3\") " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.503355 
4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.503649 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.503824 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.512997 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-kube-api-access-mn9pp" (OuterVolumeSpecName: "kube-api-access-mn9pp") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "kube-api-access-mn9pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.515093 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.518682 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/97f21b9d-25bf-4a64-94ef-51d83b662ab3-pod-info" (OuterVolumeSpecName: "pod-info") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.526221 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.531777 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f21b9d-25bf-4a64-94ef-51d83b662ab3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.570358 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-config-data" (OuterVolumeSpecName: "config-data") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.605371 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn9pp\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-kube-api-access-mn9pp\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.606273 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.609912 4870 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f21b9d-25bf-4a64-94ef-51d83b662ab3-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.610005 4870 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.610095 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.610188 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.610270 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.610348 4870 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f21b9d-25bf-4a64-94ef-51d83b662ab3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.610447 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.608839 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-server-conf" (OuterVolumeSpecName: "server-conf") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.645670 4870 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.712382 4870 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f21b9d-25bf-4a64-94ef-51d83b662ab3-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.712746 4870 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.743262 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "97f21b9d-25bf-4a64-94ef-51d83b662ab3" (UID: "97f21b9d-25bf-4a64-94ef-51d83b662ab3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.814733 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f21b9d-25bf-4a64-94ef-51d83b662ab3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:01 crc kubenswrapper[4870]: I0130 08:33:01.880149 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.019653 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-plugins\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.020838 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-tls\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021009 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-pod-info\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021141 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-erlang-cookie\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021250 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021371 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-server-conf\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021470 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-config-data\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021576 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-confd\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021694 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-plugins-conf\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021847 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-erlang-cookie-secret\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.021973 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4d8r\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-kube-api-access-g4d8r\") pod \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\" (UID: \"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd\") " Jan 30 08:33:02 crc kubenswrapper[4870]: 
I0130 08:33:02.020437 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.028260 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.029032 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.029274 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.036049 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.042048 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.046074 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-pod-info" (OuterVolumeSpecName: "pod-info") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.046179 4870 generic.go:334] "Generic (PLEG): container finished" podID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerID="121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f" exitCode=0 Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.046277 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd","Type":"ContainerDied","Data":"121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f"} Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.046305 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee2f8ed-b030-40fc-90fb-32a4e404b1fd","Type":"ContainerDied","Data":"3f0499acc4a6b0c8f2d313af1131c23462a36b6d1d5cfab2eb6312a0f9c1c357"} Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.046315 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.046700 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-kube-api-access-g4d8r" (OuterVolumeSpecName: "kube-api-access-g4d8r") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "kube-api-access-g4d8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.046323 4870 scope.go:117] "RemoveContainer" containerID="121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.088118 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f21b9d-25bf-4a64-94ef-51d83b662ab3","Type":"ContainerDied","Data":"e007871b6d10423ef6514301a7948e0b65aeec9e801d811cb06f4a5040316a29"} Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.088227 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.124634 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-config-data" (OuterVolumeSpecName: "config-data") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125344 4870 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125388 4870 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125402 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4d8r\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-kube-api-access-g4d8r\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125417 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125427 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125436 4870 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125448 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: 
I0130 08:33:02.125473 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.125485 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.228678 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-server-conf" (OuterVolumeSpecName: "server-conf") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.230003 4870 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.240710 4870 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.245528 4870 scope.go:117] "RemoveContainer" containerID="15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.288504 4870 scope.go:117] "RemoveContainer" containerID="121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f" Jan 30 08:33:02 crc kubenswrapper[4870]: E0130 08:33:02.292378 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f\": container with ID starting with 121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f not found: ID does not exist" containerID="121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.292426 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f"} err="failed to get container status \"121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f\": rpc error: code = NotFound desc = could not find container \"121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f\": container with ID starting with 121cd78e5a4355f003ebf17a16fbee0ff6a2d7322239a3f9d8f72539da950f1f not found: ID does not exist" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.292454 4870 scope.go:117] "RemoveContainer" containerID="15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49" Jan 30 08:33:02 crc kubenswrapper[4870]: E0130 08:33:02.293036 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49\": container with ID starting with 15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49 not found: ID does not exist" containerID="15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.293072 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49"} err="failed to get container status \"15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49\": rpc error: code = NotFound desc = could not find container \"15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49\": container with ID 
starting with 15912521352db9ebeacf4667e3f3fd3607893d70df7becfdb7861d554dec9b49 not found: ID does not exist" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.293110 4870 scope.go:117] "RemoveContainer" containerID="2d90fc261d4a0e6355b34b516c467ce7b3ce867fbf835cf5614291d45b33a700" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.293741 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" (UID: "0ee2f8ed-b030-40fc-90fb-32a4e404b1fd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.312743 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.328713 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.329900 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: E0130 08:33:02.330279 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerName="setup-container" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.330297 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerName="setup-container" Jan 30 08:33:02 crc kubenswrapper[4870]: E0130 08:33:02.330314 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerName="setup-container" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.330321 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerName="setup-container" Jan 30 08:33:02 crc 
kubenswrapper[4870]: E0130 08:33:02.330339 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerName="rabbitmq" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.330346 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerName="rabbitmq" Jan 30 08:33:02 crc kubenswrapper[4870]: E0130 08:33:02.330364 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerName="rabbitmq" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.330371 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerName="rabbitmq" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.330575 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" containerName="rabbitmq" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.330593 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" containerName="rabbitmq" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.331540 4870 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.331615 4870 scope.go:117] "RemoveContainer" containerID="55e6a4b3af15640088e3e1927ba88636a5cf35ec532fc2df3395e46ebcf07d79" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.331616 4870 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.331565 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.339617 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.339765 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.340280 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.340403 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.340429 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lwd7k" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.340511 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.340634 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.343786 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.421098 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.430341 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.433803 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.433851 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drr68\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-kube-api-access-drr68\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.433887 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.433923 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.433949 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.433983 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.434018 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.434040 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.434078 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.434101 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.434156 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 
crc kubenswrapper[4870]: I0130 08:33:02.442490 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.444661 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.447525 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.447693 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.447806 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.447960 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.448127 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.448310 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.448362 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hr5rb" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.450232 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536060 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxkqq\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-kube-api-access-bxkqq\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536125 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536269 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536369 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536415 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536521 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536576 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536614 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drr68\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-kube-api-access-drr68\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536638 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536691 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536721 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc 
kubenswrapper[4870]: I0130 08:33:02.536748 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536770 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536828 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536896 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536913 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536958 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536979 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.536998 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.537028 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.537067 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.537124 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.537660 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.538162 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.539944 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.540245 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.540313 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " 
pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.540407 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.540469 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.541612 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.543314 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.545268 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.557561 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-drr68\" (UniqueName: \"kubernetes.io/projected/bf05f72e-aa42-4296-a7dc-8b742d6e0aab-kube-api-access-drr68\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.585802 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bf05f72e-aa42-4296-a7dc-8b742d6e0aab\") " pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.638786 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639061 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639160 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639237 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639339 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639404 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639470 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639564 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639639 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxkqq\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-kube-api-access-bxkqq\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc 
kubenswrapper[4870]: I0130 08:33:02.639725 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639737 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639891 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.640373 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.641215 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.639823 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.641328 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.641581 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.647953 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.647961 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.648430 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-erlang-cookie-secret\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.658585 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.662484 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxkqq\" (UniqueName: \"kubernetes.io/projected/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-kube-api-access-bxkqq\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.664596 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2575ea2c-dc22-4ca2-bf0b-d67eaa330832-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.680631 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2575ea2c-dc22-4ca2-bf0b-d67eaa330832\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:02 crc kubenswrapper[4870]: I0130 08:33:02.776409 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 08:33:03 crc kubenswrapper[4870]: W0130 08:33:03.128494 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf05f72e_aa42_4296_a7dc_8b742d6e0aab.slice/crio-84957b48794c6259279a99608747a29ad7394fc9ca1bd6ce1f98a7f4c10a624e WatchSource:0}: Error finding container 84957b48794c6259279a99608747a29ad7394fc9ca1bd6ce1f98a7f4c10a624e: Status 404 returned error can't find the container with id 84957b48794c6259279a99608747a29ad7394fc9ca1bd6ce1f98a7f4c10a624e Jan 30 08:33:03 crc kubenswrapper[4870]: I0130 08:33:03.131689 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 08:33:03 crc kubenswrapper[4870]: I0130 08:33:03.325491 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 08:33:03 crc kubenswrapper[4870]: W0130 08:33:03.327790 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2575ea2c_dc22_4ca2_bf0b_d67eaa330832.slice/crio-a1c0409ce24f2df08527052ff97af7fd0efe806a48a4ecd7876b51529ce24023 WatchSource:0}: Error finding container a1c0409ce24f2df08527052ff97af7fd0efe806a48a4ecd7876b51529ce24023: Status 404 returned error can't find the container with id a1c0409ce24f2df08527052ff97af7fd0efe806a48a4ecd7876b51529ce24023 Jan 30 08:33:04 crc kubenswrapper[4870]: I0130 08:33:04.085060 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee2f8ed-b030-40fc-90fb-32a4e404b1fd" path="/var/lib/kubelet/pods/0ee2f8ed-b030-40fc-90fb-32a4e404b1fd/volumes" Jan 30 08:33:04 crc kubenswrapper[4870]: I0130 08:33:04.085927 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f21b9d-25bf-4a64-94ef-51d83b662ab3" path="/var/lib/kubelet/pods/97f21b9d-25bf-4a64-94ef-51d83b662ab3/volumes" Jan 30 08:33:04 crc kubenswrapper[4870]: 
I0130 08:33:04.110392 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2575ea2c-dc22-4ca2-bf0b-d67eaa330832","Type":"ContainerStarted","Data":"a1c0409ce24f2df08527052ff97af7fd0efe806a48a4ecd7876b51529ce24023"} Jan 30 08:33:04 crc kubenswrapper[4870]: I0130 08:33:04.111464 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf05f72e-aa42-4296-a7dc-8b742d6e0aab","Type":"ContainerStarted","Data":"84957b48794c6259279a99608747a29ad7394fc9ca1bd6ce1f98a7f4c10a624e"} Jan 30 08:33:05 crc kubenswrapper[4870]: I0130 08:33:05.124279 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2575ea2c-dc22-4ca2-bf0b-d67eaa330832","Type":"ContainerStarted","Data":"b54d083e49539914abb80a09d56280e72de1f73b7b8543555a5595e346f4fb9e"} Jan 30 08:33:05 crc kubenswrapper[4870]: I0130 08:33:05.126458 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf05f72e-aa42-4296-a7dc-8b742d6e0aab","Type":"ContainerStarted","Data":"3624472f52ac7be1319516a8ee600eb767c9e0a446d907875f2e5857dd2b649f"} Jan 30 08:33:09 crc kubenswrapper[4870]: I0130 08:33:09.894607 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:33:09 crc kubenswrapper[4870]: I0130 08:33:09.945958 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:33:10 crc kubenswrapper[4870]: I0130 08:33:10.137347 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7xmh"] Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.173844 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d7xmh" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="registry-server" 
containerID="cri-o://d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8" gracePeriod=2 Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.649715 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.828007 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-catalog-content\") pod \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.828416 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-utilities\") pod \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.828546 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mqzj\" (UniqueName: \"kubernetes.io/projected/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-kube-api-access-4mqzj\") pod \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\" (UID: \"aeaabceb-b50c-48b6-b72d-d759f1bda8c1\") " Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.828989 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-utilities" (OuterVolumeSpecName: "utilities") pod "aeaabceb-b50c-48b6-b72d-d759f1bda8c1" (UID: "aeaabceb-b50c-48b6-b72d-d759f1bda8c1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.829502 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.838640 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-kube-api-access-4mqzj" (OuterVolumeSpecName: "kube-api-access-4mqzj") pod "aeaabceb-b50c-48b6-b72d-d759f1bda8c1" (UID: "aeaabceb-b50c-48b6-b72d-d759f1bda8c1"). InnerVolumeSpecName "kube-api-access-4mqzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.931332 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mqzj\" (UniqueName: \"kubernetes.io/projected/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-kube-api-access-4mqzj\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:11 crc kubenswrapper[4870]: I0130 08:33:11.945182 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aeaabceb-b50c-48b6-b72d-d759f1bda8c1" (UID: "aeaabceb-b50c-48b6-b72d-d759f1bda8c1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.033954 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaabceb-b50c-48b6-b72d-d759f1bda8c1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.187859 4870 generic.go:334] "Generic (PLEG): container finished" podID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerID="d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8" exitCode=0 Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.187948 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7xmh" event={"ID":"aeaabceb-b50c-48b6-b72d-d759f1bda8c1","Type":"ContainerDied","Data":"d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8"} Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.187986 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7xmh" event={"ID":"aeaabceb-b50c-48b6-b72d-d759f1bda8c1","Type":"ContainerDied","Data":"ce7d2fa7b2f490ff295696455cb253c22c42342b96f4aac8e217dc481875d3d8"} Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.188013 4870 scope.go:117] "RemoveContainer" containerID="d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.190337 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d7xmh" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.213619 4870 scope.go:117] "RemoveContainer" containerID="949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.221256 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7xmh"] Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.234081 4870 scope.go:117] "RemoveContainer" containerID="5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.234403 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d7xmh"] Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.293319 4870 scope.go:117] "RemoveContainer" containerID="d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8" Jan 30 08:33:12 crc kubenswrapper[4870]: E0130 08:33:12.293922 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8\": container with ID starting with d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8 not found: ID does not exist" containerID="d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.293974 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8"} err="failed to get container status \"d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8\": rpc error: code = NotFound desc = could not find container \"d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8\": container with ID starting with d95aedd09a6225166721cd6f604fbff2dfd2107f2be52447765e651673cee8d8 not found: ID does 
not exist" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.294006 4870 scope.go:117] "RemoveContainer" containerID="949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff" Jan 30 08:33:12 crc kubenswrapper[4870]: E0130 08:33:12.294453 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff\": container with ID starting with 949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff not found: ID does not exist" containerID="949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.294664 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff"} err="failed to get container status \"949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff\": rpc error: code = NotFound desc = could not find container \"949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff\": container with ID starting with 949d5f0f4a17100c16a04fa345ac82cf5d135826bf2bae1bad472102789c52ff not found: ID does not exist" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.294821 4870 scope.go:117] "RemoveContainer" containerID="5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b" Jan 30 08:33:12 crc kubenswrapper[4870]: E0130 08:33:12.295380 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b\": container with ID starting with 5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b not found: ID does not exist" containerID="5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b" Jan 30 08:33:12 crc kubenswrapper[4870]: I0130 08:33:12.295424 4870 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b"} err="failed to get container status \"5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b\": rpc error: code = NotFound desc = could not find container \"5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b\": container with ID starting with 5f3286bcb78108dac7517ba9bb6ab1b643045f2decd36211166b9e846ca3555b not found: ID does not exist" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.404506 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8545fb859-qvd2l"] Jan 30 08:33:13 crc kubenswrapper[4870]: E0130 08:33:13.405346 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="extract-utilities" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.405363 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="extract-utilities" Jan 30 08:33:13 crc kubenswrapper[4870]: E0130 08:33:13.405383 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="extract-content" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.405392 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="extract-content" Jan 30 08:33:13 crc kubenswrapper[4870]: E0130 08:33:13.405404 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="registry-server" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.405411 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="registry-server" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.405702 4870 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" containerName="registry-server" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.407344 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.408861 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.417491 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8545fb859-qvd2l"] Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.562607 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqp7f\" (UniqueName: \"kubernetes.io/projected/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-kube-api-access-xqp7f\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.562646 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-svc\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.562693 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-sb\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.562776 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-config\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.562865 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-nb\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.562946 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.563155 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-swift-storage-0\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.665220 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-config\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.665274 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-nb\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.665299 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.665336 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-swift-storage-0\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.665422 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqp7f\" (UniqueName: \"kubernetes.io/projected/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-kube-api-access-xqp7f\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.665442 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-svc\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.665483 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-sb\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.666298 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-sb\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.666297 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-swift-storage-0\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.666619 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-svc\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.666704 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.666997 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.667082 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-config\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.684442 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqp7f\" (UniqueName: \"kubernetes.io/projected/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-kube-api-access-xqp7f\") pod \"dnsmasq-dns-8545fb859-qvd2l\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:13 crc kubenswrapper[4870]: I0130 08:33:13.765207 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:14 crc kubenswrapper[4870]: I0130 08:33:14.086640 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeaabceb-b50c-48b6-b72d-d759f1bda8c1" path="/var/lib/kubelet/pods/aeaabceb-b50c-48b6-b72d-d759f1bda8c1/volumes" Jan 30 08:33:14 crc kubenswrapper[4870]: I0130 08:33:14.238356 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8545fb859-qvd2l"] Jan 30 08:33:15 crc kubenswrapper[4870]: I0130 08:33:15.224257 4870 generic.go:334] "Generic (PLEG): container finished" podID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerID="aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa" exitCode=0 Jan 30 08:33:15 crc kubenswrapper[4870]: I0130 08:33:15.224812 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" 
event={"ID":"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed","Type":"ContainerDied","Data":"aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa"} Jan 30 08:33:15 crc kubenswrapper[4870]: I0130 08:33:15.227715 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" event={"ID":"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed","Type":"ContainerStarted","Data":"7826d9b0a3ff0627f9617f1e48774768eeb63e9d54a8e156f621d77dbe1d82e2"} Jan 30 08:33:16 crc kubenswrapper[4870]: I0130 08:33:16.238017 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" event={"ID":"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed","Type":"ContainerStarted","Data":"48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c"} Jan 30 08:33:16 crc kubenswrapper[4870]: I0130 08:33:16.238413 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:16 crc kubenswrapper[4870]: I0130 08:33:16.272970 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" podStartSLOduration=3.2729508689999998 podStartE2EDuration="3.272950869s" podCreationTimestamp="2026-01-30 08:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:33:16.264288867 +0000 UTC m=+1434.959835986" watchObservedRunningTime="2026-01-30 08:33:16.272950869 +0000 UTC m=+1434.968497978" Jan 30 08:33:23 crc kubenswrapper[4870]: I0130 08:33:23.766053 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:23 crc kubenswrapper[4870]: I0130 08:33:23.866184 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6999845677-vd26g"] Jan 30 08:33:23 crc kubenswrapper[4870]: I0130 08:33:23.866405 4870 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-6999845677-vd26g" podUID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerName="dnsmasq-dns" containerID="cri-o://478f243e0d74f6dbf93b850491c64bcf3ea2a501bedee4e704be06e6e754b799" gracePeriod=10 Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.005952 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66968b76ff-bk2j6"] Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.008355 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.014453 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66968b76ff-bk2j6"] Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.180647 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-dns-svc\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.180695 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-ovsdbserver-nb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.180724 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-ovsdbserver-sb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: 
I0130 08:33:24.180772 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-dns-swift-storage-0\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.180795 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q5jb\" (UniqueName: \"kubernetes.io/projected/3f90c906-9b1e-4df6-8b94-367ae01963b7-kube-api-access-2q5jb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.180856 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.180975 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-config\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.282707 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-config\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 
crc kubenswrapper[4870]: I0130 08:33:24.282914 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-dns-svc\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.282943 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-ovsdbserver-nb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.282969 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-ovsdbserver-sb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.283017 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q5jb\" (UniqueName: \"kubernetes.io/projected/3f90c906-9b1e-4df6-8b94-367ae01963b7-kube-api-access-2q5jb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.283038 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-dns-swift-storage-0\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 
08:33:24.283087 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.284121 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-openstack-edpm-ipam\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.284174 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-dns-swift-storage-0\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.284220 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-ovsdbserver-nb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.284317 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-ovsdbserver-sb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.284723 4870 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-config\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.287125 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f90c906-9b1e-4df6-8b94-367ae01963b7-dns-svc\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.323464 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q5jb\" (UniqueName: \"kubernetes.io/projected/3f90c906-9b1e-4df6-8b94-367ae01963b7-kube-api-access-2q5jb\") pod \"dnsmasq-dns-66968b76ff-bk2j6\" (UID: \"3f90c906-9b1e-4df6-8b94-367ae01963b7\") " pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.334178 4870 generic.go:334] "Generic (PLEG): container finished" podID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerID="478f243e0d74f6dbf93b850491c64bcf3ea2a501bedee4e704be06e6e754b799" exitCode=0 Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.334231 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999845677-vd26g" event={"ID":"8a0f9be1-926a-4340-9f05-ba673e3e471e","Type":"ContainerDied","Data":"478f243e0d74f6dbf93b850491c64bcf3ea2a501bedee4e704be06e6e754b799"} Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.368890 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.473112 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6999845677-vd26g" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.591753 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-swift-storage-0\") pod \"8a0f9be1-926a-4340-9f05-ba673e3e471e\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.599063 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-svc\") pod \"8a0f9be1-926a-4340-9f05-ba673e3e471e\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.599258 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2gv2\" (UniqueName: \"kubernetes.io/projected/8a0f9be1-926a-4340-9f05-ba673e3e471e-kube-api-access-c2gv2\") pod \"8a0f9be1-926a-4340-9f05-ba673e3e471e\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.599320 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-sb\") pod \"8a0f9be1-926a-4340-9f05-ba673e3e471e\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.599354 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-config\") pod \"8a0f9be1-926a-4340-9f05-ba673e3e471e\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.599802 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-nb\") pod \"8a0f9be1-926a-4340-9f05-ba673e3e471e\" (UID: \"8a0f9be1-926a-4340-9f05-ba673e3e471e\") " Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.617226 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0f9be1-926a-4340-9f05-ba673e3e471e-kube-api-access-c2gv2" (OuterVolumeSpecName: "kube-api-access-c2gv2") pod "8a0f9be1-926a-4340-9f05-ba673e3e471e" (UID: "8a0f9be1-926a-4340-9f05-ba673e3e471e"). InnerVolumeSpecName "kube-api-access-c2gv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.669963 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-config" (OuterVolumeSpecName: "config") pod "8a0f9be1-926a-4340-9f05-ba673e3e471e" (UID: "8a0f9be1-926a-4340-9f05-ba673e3e471e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.681229 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8a0f9be1-926a-4340-9f05-ba673e3e471e" (UID: "8a0f9be1-926a-4340-9f05-ba673e3e471e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.682306 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a0f9be1-926a-4340-9f05-ba673e3e471e" (UID: "8a0f9be1-926a-4340-9f05-ba673e3e471e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.685387 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a0f9be1-926a-4340-9f05-ba673e3e471e" (UID: "8a0f9be1-926a-4340-9f05-ba673e3e471e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.703192 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.703933 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.703946 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2gv2\" (UniqueName: \"kubernetes.io/projected/8a0f9be1-926a-4340-9f05-ba673e3e471e-kube-api-access-c2gv2\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.703962 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.703972 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-config\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.704093 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a0f9be1-926a-4340-9f05-ba673e3e471e" (UID: "8a0f9be1-926a-4340-9f05-ba673e3e471e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.806777 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a0f9be1-926a-4340-9f05-ba673e3e471e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 08:33:24 crc kubenswrapper[4870]: I0130 08:33:24.995435 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66968b76ff-bk2j6"] Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.345841 4870 generic.go:334] "Generic (PLEG): container finished" podID="3f90c906-9b1e-4df6-8b94-367ae01963b7" containerID="0addfc6a8224617c6f63521fe23229a28227d9e93d1629ec2c7dad7e45a59cf9" exitCode=0 Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.345909 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" event={"ID":"3f90c906-9b1e-4df6-8b94-367ae01963b7","Type":"ContainerDied","Data":"0addfc6a8224617c6f63521fe23229a28227d9e93d1629ec2c7dad7e45a59cf9"} Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.346393 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" event={"ID":"3f90c906-9b1e-4df6-8b94-367ae01963b7","Type":"ContainerStarted","Data":"4cf2d7c4e57e6581d6da7b31d6f1b60b35039aa0e3312141c04d27d29fd51192"} Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.349954 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6999845677-vd26g" event={"ID":"8a0f9be1-926a-4340-9f05-ba673e3e471e","Type":"ContainerDied","Data":"b68353bb62e0c010f98866001c94c9fbd25b787dc4392f6cd92563052dd236dc"} Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.350002 4870 
scope.go:117] "RemoveContainer" containerID="478f243e0d74f6dbf93b850491c64bcf3ea2a501bedee4e704be06e6e754b799" Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.350042 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6999845677-vd26g" Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.435863 4870 scope.go:117] "RemoveContainer" containerID="cf4692acee92608a7992da7d8327f9e59bf6302ddd00cf4b1c51b56d002d56e2" Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.450355 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6999845677-vd26g"] Jan 30 08:33:25 crc kubenswrapper[4870]: I0130 08:33:25.460924 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6999845677-vd26g"] Jan 30 08:33:26 crc kubenswrapper[4870]: I0130 08:33:26.088286 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0f9be1-926a-4340-9f05-ba673e3e471e" path="/var/lib/kubelet/pods/8a0f9be1-926a-4340-9f05-ba673e3e471e/volumes" Jan 30 08:33:26 crc kubenswrapper[4870]: I0130 08:33:26.358865 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" event={"ID":"3f90c906-9b1e-4df6-8b94-367ae01963b7","Type":"ContainerStarted","Data":"4b04f926d24d905b140ac17350d4e04e61cb1e7defd63fecb73a38e721dc978f"} Jan 30 08:33:26 crc kubenswrapper[4870]: I0130 08:33:26.359023 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:26 crc kubenswrapper[4870]: I0130 08:33:26.377272 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" podStartSLOduration=3.377254902 podStartE2EDuration="3.377254902s" podCreationTimestamp="2026-01-30 08:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 
08:33:26.373747393 +0000 UTC m=+1445.069294512" watchObservedRunningTime="2026-01-30 08:33:26.377254902 +0000 UTC m=+1445.072802011" Jan 30 08:33:34 crc kubenswrapper[4870]: I0130 08:33:34.371128 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66968b76ff-bk2j6" Jan 30 08:33:34 crc kubenswrapper[4870]: I0130 08:33:34.437023 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8545fb859-qvd2l"] Jan 30 08:33:34 crc kubenswrapper[4870]: I0130 08:33:34.437901 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" podUID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerName="dnsmasq-dns" containerID="cri-o://48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c" gracePeriod=10 Jan 30 08:33:34 crc kubenswrapper[4870]: I0130 08:33:34.910131 4870 scope.go:117] "RemoveContainer" containerID="7a7adac6f43dd00107198ca12f07a56a507d2b37982cb644a01747e8eb0b5b52" Jan 30 08:33:34 crc kubenswrapper[4870]: I0130 08:33:34.957130 4870 scope.go:117] "RemoveContainer" containerID="cd0dfeab70fb307cbb6535bdd2b5daa2556dc7c49a1bf88e90112f1cde7b135d" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.049718 4870 scope.go:117] "RemoveContainer" containerID="d52e878ce9dd90e8dba444ebd6a2071ac79735b92b1f1220889d88eefcb18bc4" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.092695 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.208639 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-sb\") pod \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.208719 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-openstack-edpm-ipam\") pod \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.208825 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-svc\") pod \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.208900 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-swift-storage-0\") pod \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.208946 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-config\") pod \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.208963 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xqp7f\" (UniqueName: \"kubernetes.io/projected/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-kube-api-access-xqp7f\") pod \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.208984 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-nb\") pod \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\" (UID: \"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed\") " Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.213824 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-kube-api-access-xqp7f" (OuterVolumeSpecName: "kube-api-access-xqp7f") pod "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" (UID: "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed"). InnerVolumeSpecName "kube-api-access-xqp7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.261492 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" (UID: "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.263744 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" (UID: "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.270849 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" (UID: "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.271271 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" (UID: "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.276491 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" (UID: "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.284048 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-config" (OuterVolumeSpecName: "config") pod "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" (UID: "06541376-f5b8-4c6d-bd52-cf6b2d30e2ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.311352 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.311384 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.311395 4870 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.311405 4870 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.311413 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.311422 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqp7f\" (UniqueName: \"kubernetes.io/projected/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-kube-api-access-xqp7f\") on node \"crc\" DevicePath \"\""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.311431 4870 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.460104 4870 generic.go:334] "Generic (PLEG): container finished" podID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerID="48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c" exitCode=0
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.460138 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8545fb859-qvd2l"
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.460188 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" event={"ID":"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed","Type":"ContainerDied","Data":"48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c"}
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.460224 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8545fb859-qvd2l" event={"ID":"06541376-f5b8-4c6d-bd52-cf6b2d30e2ed","Type":"ContainerDied","Data":"7826d9b0a3ff0627f9617f1e48774768eeb63e9d54a8e156f621d77dbe1d82e2"}
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.460246 4870 scope.go:117] "RemoveContainer" containerID="48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c"
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.482563 4870 scope.go:117] "RemoveContainer" containerID="aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa"
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.490429 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8545fb859-qvd2l"]
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.500943 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8545fb859-qvd2l"]
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.509292 4870 scope.go:117] "RemoveContainer" containerID="48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c"
Jan 30 08:33:35 crc kubenswrapper[4870]: E0130 08:33:35.510412 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c\": container with ID starting with 48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c not found: ID does not exist" containerID="48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c"
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.510492 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c"} err="failed to get container status \"48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c\": rpc error: code = NotFound desc = could not find container \"48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c\": container with ID starting with 48d93727034361b344ad973bd73104fc4515555aaf6f73c02cf9dc34de44151c not found: ID does not exist"
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.510583 4870 scope.go:117] "RemoveContainer" containerID="aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa"
Jan 30 08:33:35 crc kubenswrapper[4870]: E0130 08:33:35.511035 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa\": container with ID starting with aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa not found: ID does not exist" containerID="aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa"
Jan 30 08:33:35 crc kubenswrapper[4870]: I0130 08:33:35.511104 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa"} err="failed to get container status \"aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa\": rpc error: code = NotFound desc = could not find container \"aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa\": container with ID starting with aa7fda4abf155c21a53cf2f98e31d5c8a03cb75e77e5b68e71924e2b6539bcaa not found: ID does not exist"
Jan 30 08:33:36 crc kubenswrapper[4870]: I0130 08:33:36.086024 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" path="/var/lib/kubelet/pods/06541376-f5b8-4c6d-bd52-cf6b2d30e2ed/volumes"
Jan 30 08:33:37 crc kubenswrapper[4870]: I0130 08:33:37.484300 4870 generic.go:334] "Generic (PLEG): container finished" podID="2575ea2c-dc22-4ca2-bf0b-d67eaa330832" containerID="b54d083e49539914abb80a09d56280e72de1f73b7b8543555a5595e346f4fb9e" exitCode=0
Jan 30 08:33:37 crc kubenswrapper[4870]: I0130 08:33:37.484413 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2575ea2c-dc22-4ca2-bf0b-d67eaa330832","Type":"ContainerDied","Data":"b54d083e49539914abb80a09d56280e72de1f73b7b8543555a5595e346f4fb9e"}
Jan 30 08:33:37 crc kubenswrapper[4870]: I0130 08:33:37.487720 4870 generic.go:334] "Generic (PLEG): container finished" podID="bf05f72e-aa42-4296-a7dc-8b742d6e0aab" containerID="3624472f52ac7be1319516a8ee600eb767c9e0a446d907875f2e5857dd2b649f" exitCode=0
Jan 30 08:33:37 crc kubenswrapper[4870]: I0130 08:33:37.487763 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf05f72e-aa42-4296-a7dc-8b742d6e0aab","Type":"ContainerDied","Data":"3624472f52ac7be1319516a8ee600eb767c9e0a446d907875f2e5857dd2b649f"}
Jan 30 08:33:38 crc kubenswrapper[4870]: I0130 08:33:38.498786 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf05f72e-aa42-4296-a7dc-8b742d6e0aab","Type":"ContainerStarted","Data":"db147700e9462fac8000f8f140a1d336d90dd98b395146a598c0eb481a3983a5"}
Jan 30 08:33:38 crc kubenswrapper[4870]: I0130 08:33:38.499404 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 30 08:33:38 crc kubenswrapper[4870]: I0130 08:33:38.501938 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2575ea2c-dc22-4ca2-bf0b-d67eaa330832","Type":"ContainerStarted","Data":"f76a1a1abe0c8909c0ecbc74f8237bab4820a439c1adbd52cfdc4e24d255c330"}
Jan 30 08:33:38 crc kubenswrapper[4870]: I0130 08:33:38.502191 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 08:33:38 crc kubenswrapper[4870]: I0130 08:33:38.531002 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.530983133 podStartE2EDuration="36.530983133s" podCreationTimestamp="2026-01-30 08:33:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:33:38.52516893 +0000 UTC m=+1457.220716069" watchObservedRunningTime="2026-01-30 08:33:38.530983133 +0000 UTC m=+1457.226530262"
Jan 30 08:33:38 crc kubenswrapper[4870]: I0130 08:33:38.555702 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.555683787 podStartE2EDuration="36.555683787s" podCreationTimestamp="2026-01-30 08:33:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:33:38.551487356 +0000 UTC m=+1457.247034465" watchObservedRunningTime="2026-01-30 08:33:38.555683787 +0000 UTC m=+1457.251230906"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.842293 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"]
Jan 30 08:33:46 crc kubenswrapper[4870]: E0130 08:33:46.843660 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerName="init"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.843679 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerName="init"
Jan 30 08:33:46 crc kubenswrapper[4870]: E0130 08:33:46.843703 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerName="dnsmasq-dns"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.843712 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerName="dnsmasq-dns"
Jan 30 08:33:46 crc kubenswrapper[4870]: E0130 08:33:46.843729 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerName="dnsmasq-dns"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.843737 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerName="dnsmasq-dns"
Jan 30 08:33:46 crc kubenswrapper[4870]: E0130 08:33:46.843769 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerName="init"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.843780 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerName="init"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.844059 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0f9be1-926a-4340-9f05-ba673e3e471e" containerName="dnsmasq-dns"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.844095 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="06541376-f5b8-4c6d-bd52-cf6b2d30e2ed" containerName="dnsmasq-dns"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.844998 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.850210 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.854619 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.855044 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.855600 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.875136 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"]
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.951113 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.951516 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.951768 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:46 crc kubenswrapper[4870]: I0130 08:33:46.951833 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq5sg\" (UniqueName: \"kubernetes.io/projected/68089c9f-f566-4e65-b2ea-dd65a4d9012c-kube-api-access-sq5sg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.053530 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.053633 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.053664 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.053691 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq5sg\" (UniqueName: \"kubernetes.io/projected/68089c9f-f566-4e65-b2ea-dd65a4d9012c-kube-api-access-sq5sg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.059205 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.061327 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.069772 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.070507 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq5sg\" (UniqueName: \"kubernetes.io/projected/68089c9f-f566-4e65-b2ea-dd65a4d9012c-kube-api-access-sq5sg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.188701 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:33:47 crc kubenswrapper[4870]: W0130 08:33:47.907475 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68089c9f_f566_4e65_b2ea_dd65a4d9012c.slice/crio-8a1331f6e7eb347ff4b3857fd9acf81ba2e56b9d245819fa7e1216f5d37fa0b6 WatchSource:0}: Error finding container 8a1331f6e7eb347ff4b3857fd9acf81ba2e56b9d245819fa7e1216f5d37fa0b6: Status 404 returned error can't find the container with id 8a1331f6e7eb347ff4b3857fd9acf81ba2e56b9d245819fa7e1216f5d37fa0b6
Jan 30 08:33:47 crc kubenswrapper[4870]: I0130 08:33:47.911963 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"]
Jan 30 08:33:48 crc kubenswrapper[4870]: I0130 08:33:48.596992 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm" event={"ID":"68089c9f-f566-4e65-b2ea-dd65a4d9012c","Type":"ContainerStarted","Data":"8a1331f6e7eb347ff4b3857fd9acf81ba2e56b9d245819fa7e1216f5d37fa0b6"}
Jan 30 08:33:52 crc kubenswrapper[4870]: I0130 08:33:52.663017 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bf05f72e-aa42-4296-a7dc-8b742d6e0aab" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.231:5671: connect: connection refused"
Jan 30 08:33:52 crc kubenswrapper[4870]: I0130 08:33:52.782156 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 08:34:00 crc kubenswrapper[4870]: I0130 08:34:00.715446 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm" event={"ID":"68089c9f-f566-4e65-b2ea-dd65a4d9012c","Type":"ContainerStarted","Data":"229d0b4cb2361e681eec79909ce30ca976b76a97bc610f98c8041877c395c51a"}
Jan 30 08:34:00 crc kubenswrapper[4870]: I0130 08:34:00.745401 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm" podStartSLOduration=2.765465949 podStartE2EDuration="14.745372799s" podCreationTimestamp="2026-01-30 08:33:46 +0000 UTC" firstStartedPulling="2026-01-30 08:33:47.915165955 +0000 UTC m=+1466.610713054" lastFinishedPulling="2026-01-30 08:33:59.895072795 +0000 UTC m=+1478.590619904" observedRunningTime="2026-01-30 08:34:00.731585116 +0000 UTC m=+1479.427132245" watchObservedRunningTime="2026-01-30 08:34:00.745372799 +0000 UTC m=+1479.440919928"
Jan 30 08:34:02 crc kubenswrapper[4870]: I0130 08:34:02.661899 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 30 08:34:12 crc kubenswrapper[4870]: I0130 08:34:12.862905 4870 generic.go:334] "Generic (PLEG): container finished" podID="68089c9f-f566-4e65-b2ea-dd65a4d9012c" containerID="229d0b4cb2361e681eec79909ce30ca976b76a97bc610f98c8041877c395c51a" exitCode=0
Jan 30 08:34:12 crc kubenswrapper[4870]: I0130 08:34:12.863066 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm" event={"ID":"68089c9f-f566-4e65-b2ea-dd65a4d9012c","Type":"ContainerDied","Data":"229d0b4cb2361e681eec79909ce30ca976b76a97bc610f98c8041877c395c51a"}
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.290962 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.438614 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq5sg\" (UniqueName: \"kubernetes.io/projected/68089c9f-f566-4e65-b2ea-dd65a4d9012c-kube-api-access-sq5sg\") pod \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") "
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.438798 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-repo-setup-combined-ca-bundle\") pod \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") "
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.438841 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-ssh-key-openstack-edpm-ipam\") pod \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") "
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.438928 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-inventory\") pod \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\" (UID: \"68089c9f-f566-4e65-b2ea-dd65a4d9012c\") "
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.445838 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68089c9f-f566-4e65-b2ea-dd65a4d9012c-kube-api-access-sq5sg" (OuterVolumeSpecName: "kube-api-access-sq5sg") pod "68089c9f-f566-4e65-b2ea-dd65a4d9012c" (UID: "68089c9f-f566-4e65-b2ea-dd65a4d9012c"). InnerVolumeSpecName "kube-api-access-sq5sg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.446585 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "68089c9f-f566-4e65-b2ea-dd65a4d9012c" (UID: "68089c9f-f566-4e65-b2ea-dd65a4d9012c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.468775 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-inventory" (OuterVolumeSpecName: "inventory") pod "68089c9f-f566-4e65-b2ea-dd65a4d9012c" (UID: "68089c9f-f566-4e65-b2ea-dd65a4d9012c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.491038 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "68089c9f-f566-4e65-b2ea-dd65a4d9012c" (UID: "68089c9f-f566-4e65-b2ea-dd65a4d9012c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.543705 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq5sg\" (UniqueName: \"kubernetes.io/projected/68089c9f-f566-4e65-b2ea-dd65a4d9012c-kube-api-access-sq5sg\") on node \"crc\" DevicePath \"\""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.543766 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.543785 4870 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.543805 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68089c9f-f566-4e65-b2ea-dd65a4d9012c-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.893625 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm" event={"ID":"68089c9f-f566-4e65-b2ea-dd65a4d9012c","Type":"ContainerDied","Data":"8a1331f6e7eb347ff4b3857fd9acf81ba2e56b9d245819fa7e1216f5d37fa0b6"}
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.894033 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a1331f6e7eb347ff4b3857fd9acf81ba2e56b9d245819fa7e1216f5d37fa0b6"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.893677 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.988306 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"]
Jan 30 08:34:14 crc kubenswrapper[4870]: E0130 08:34:14.988738 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68089c9f-f566-4e65-b2ea-dd65a4d9012c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.988758 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="68089c9f-f566-4e65-b2ea-dd65a4d9012c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.989061 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="68089c9f-f566-4e65-b2ea-dd65a4d9012c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.989841 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.992124 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.992182 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.992701 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.993151 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:34:14 crc kubenswrapper[4870]: I0130 08:34:14.999747 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"]
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.156209 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fkhs\" (UniqueName: \"kubernetes.io/projected/c22cad0f-b909-42fa-95c5-2536e1105161-kube-api-access-4fkhs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.156507 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.156561 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.259506 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fkhs\" (UniqueName: \"kubernetes.io/projected/c22cad0f-b909-42fa-95c5-2536e1105161-kube-api-access-4fkhs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.259633 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.259690 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.266634 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.266946 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.288398 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fkhs\" (UniqueName: \"kubernetes.io/projected/c22cad0f-b909-42fa-95c5-2536e1105161-kube-api-access-4fkhs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fpd48\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.350145 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.886143 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"]
Jan 30 08:34:15 crc kubenswrapper[4870]: I0130 08:34:15.904658 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48" event={"ID":"c22cad0f-b909-42fa-95c5-2536e1105161","Type":"ContainerStarted","Data":"487f115d8a470a1ae56a3e31ff4d51daf55d472d8271d7644a3fc821dc96f45b"}
Jan 30 08:34:16 crc kubenswrapper[4870]: I0130 08:34:16.918463 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48" event={"ID":"c22cad0f-b909-42fa-95c5-2536e1105161","Type":"ContainerStarted","Data":"d9ef039ab8c2e568b2b52b85afceb2fe4365ed15ae540b3a7b48e6bc4512bd56"}
Jan 30 08:34:16 crc kubenswrapper[4870]: I0130 08:34:16.936649 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48" podStartSLOduration=2.5014558730000003 podStartE2EDuration="2.9366318s" podCreationTimestamp="2026-01-30 08:34:14 +0000 UTC" firstStartedPulling="2026-01-30 08:34:15.888312026 +0000 UTC m=+1494.583859135" lastFinishedPulling="2026-01-30 08:34:16.323487953 +0000 UTC m=+1495.019035062" observedRunningTime="2026-01-30 08:34:16.931823348 +0000 UTC m=+1495.627370457" watchObservedRunningTime="2026-01-30 08:34:16.9366318 +0000 UTC m=+1495.632178909"
Jan 30 08:34:19 crc kubenswrapper[4870]: I0130 08:34:19.952702 4870 generic.go:334] "Generic (PLEG): container finished" podID="c22cad0f-b909-42fa-95c5-2536e1105161" containerID="d9ef039ab8c2e568b2b52b85afceb2fe4365ed15ae540b3a7b48e6bc4512bd56" exitCode=0
Jan 30 08:34:19 crc kubenswrapper[4870]: I0130 08:34:19.952791 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48" event={"ID":"c22cad0f-b909-42fa-95c5-2536e1105161","Type":"ContainerDied","Data":"d9ef039ab8c2e568b2b52b85afceb2fe4365ed15ae540b3a7b48e6bc4512bd56"}
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.444765 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.594013 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-ssh-key-openstack-edpm-ipam\") pod \"c22cad0f-b909-42fa-95c5-2536e1105161\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") "
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.594111 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-inventory\") pod \"c22cad0f-b909-42fa-95c5-2536e1105161\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") "
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.594357 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fkhs\" (UniqueName: \"kubernetes.io/projected/c22cad0f-b909-42fa-95c5-2536e1105161-kube-api-access-4fkhs\") pod \"c22cad0f-b909-42fa-95c5-2536e1105161\" (UID: \"c22cad0f-b909-42fa-95c5-2536e1105161\") "
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.598929 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c22cad0f-b909-42fa-95c5-2536e1105161-kube-api-access-4fkhs" (OuterVolumeSpecName: "kube-api-access-4fkhs") pod "c22cad0f-b909-42fa-95c5-2536e1105161" (UID: "c22cad0f-b909-42fa-95c5-2536e1105161"). InnerVolumeSpecName "kube-api-access-4fkhs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.621496 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c22cad0f-b909-42fa-95c5-2536e1105161" (UID: "c22cad0f-b909-42fa-95c5-2536e1105161"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.621938 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-inventory" (OuterVolumeSpecName: "inventory") pod "c22cad0f-b909-42fa-95c5-2536e1105161" (UID: "c22cad0f-b909-42fa-95c5-2536e1105161"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.696918 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.696956 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c22cad0f-b909-42fa-95c5-2536e1105161-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.696972 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fkhs\" (UniqueName: \"kubernetes.io/projected/c22cad0f-b909-42fa-95c5-2536e1105161-kube-api-access-4fkhs\") on node \"crc\" DevicePath \"\""
Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.976164 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48"
event={"ID":"c22cad0f-b909-42fa-95c5-2536e1105161","Type":"ContainerDied","Data":"487f115d8a470a1ae56a3e31ff4d51daf55d472d8271d7644a3fc821dc96f45b"} Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.976215 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="487f115d8a470a1ae56a3e31ff4d51daf55d472d8271d7644a3fc821dc96f45b" Jan 30 08:34:21 crc kubenswrapper[4870]: I0130 08:34:21.976215 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fpd48" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.052982 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c"] Jan 30 08:34:22 crc kubenswrapper[4870]: E0130 08:34:22.053523 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22cad0f-b909-42fa-95c5-2536e1105161" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.053545 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22cad0f-b909-42fa-95c5-2536e1105161" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.053821 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c22cad0f-b909-42fa-95c5-2536e1105161" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.054920 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.059376 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.059498 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.059622 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.059782 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.069268 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c"] Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.213048 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.213177 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-725dm\" (UniqueName: \"kubernetes.io/projected/620aba2c-f389-4fc9-a27c-28c937894f7d-kube-api-access-725dm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.213257 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.213325 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.315005 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-725dm\" (UniqueName: \"kubernetes.io/projected/620aba2c-f389-4fc9-a27c-28c937894f7d-kube-api-access-725dm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.315336 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.315472 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.315564 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.322229 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.322230 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.329048 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.349182 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-725dm\" (UniqueName: \"kubernetes.io/projected/620aba2c-f389-4fc9-a27c-28c937894f7d-kube-api-access-725dm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.384896 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" Jan 30 08:34:22 crc kubenswrapper[4870]: I0130 08:34:22.968787 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c"] Jan 30 08:34:23 crc kubenswrapper[4870]: I0130 08:34:23.001501 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" event={"ID":"620aba2c-f389-4fc9-a27c-28c937894f7d","Type":"ContainerStarted","Data":"5bf4e90c0f45b9a3ea6282a3aabea041683d9fdd400c0367a4321598c264f32e"} Jan 30 08:34:25 crc kubenswrapper[4870]: I0130 08:34:25.249570 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:34:25 crc kubenswrapper[4870]: I0130 08:34:25.250139 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 
08:34:26 crc kubenswrapper[4870]: I0130 08:34:26.029138 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" event={"ID":"620aba2c-f389-4fc9-a27c-28c937894f7d","Type":"ContainerStarted","Data":"9059f589d52c73a92e30dc6901a5f313013722880c7f17d79aebc9067dcd7fa9"} Jan 30 08:34:26 crc kubenswrapper[4870]: I0130 08:34:26.047458 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" podStartSLOduration=1.307072706 podStartE2EDuration="4.047440419s" podCreationTimestamp="2026-01-30 08:34:22 +0000 UTC" firstStartedPulling="2026-01-30 08:34:22.975006613 +0000 UTC m=+1501.670553762" lastFinishedPulling="2026-01-30 08:34:25.715374346 +0000 UTC m=+1504.410921475" observedRunningTime="2026-01-30 08:34:26.046209211 +0000 UTC m=+1504.741756330" watchObservedRunningTime="2026-01-30 08:34:26.047440419 +0000 UTC m=+1504.742987538" Jan 30 08:34:35 crc kubenswrapper[4870]: I0130 08:34:35.250879 4870 scope.go:117] "RemoveContainer" containerID="3cc3794f576037b7275283832f1fcd12d44b3421b4fb40fee74fe7e2b82882e4" Jan 30 08:34:35 crc kubenswrapper[4870]: I0130 08:34:35.287622 4870 scope.go:117] "RemoveContainer" containerID="5ab755a04f7323eeb1d2f9974fb01380d8fa7fa3cba2f17b2589a502cb34ef24" Jan 30 08:34:55 crc kubenswrapper[4870]: I0130 08:34:55.249735 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:34:55 crc kubenswrapper[4870]: I0130 08:34:55.250440 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.250177 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.250858 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.250945 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.251905 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.251981 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" gracePeriod=600 Jan 30 08:35:25 crc kubenswrapper[4870]: E0130 08:35:25.376053 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.705944 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" exitCode=0 Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.706027 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"} Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.706100 4870 scope.go:117] "RemoveContainer" containerID="fed0dc1b3541c4793a049fc7617c3773e2d05f0ebfb934d3acc5ededede3b844" Jan 30 08:35:25 crc kubenswrapper[4870]: I0130 08:35:25.707446 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:35:25 crc kubenswrapper[4870]: E0130 08:35:25.707911 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:35:35 crc kubenswrapper[4870]: I0130 08:35:35.385049 4870 scope.go:117] "RemoveContainer" containerID="8e106b0c6b2ed513250f13c043895b69dbe1cd77d36b5ecd4e47e2f2226b112e" Jan 30 08:35:35 
crc kubenswrapper[4870]: I0130 08:35:35.418213 4870 scope.go:117] "RemoveContainer" containerID="b1d3aae9bf64c7d5adb6c7a0c0cef4cbd05ceee79bf2bdc26b1676e0ef8ac7ff" Jan 30 08:35:36 crc kubenswrapper[4870]: I0130 08:35:36.074755 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:35:36 crc kubenswrapper[4870]: E0130 08:35:36.075126 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.075134 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:35:51 crc kubenswrapper[4870]: E0130 08:35:51.077571 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.208868 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqdjn"] Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.216547 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.236335 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqdjn"] Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.311974 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-catalog-content\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.312051 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvhrq\" (UniqueName: \"kubernetes.io/projected/e0a93ca5-633c-4649-b23a-38f6ad85457c-kube-api-access-dvhrq\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.312154 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-utilities\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.415384 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-catalog-content\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.415445 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dvhrq\" (UniqueName: \"kubernetes.io/projected/e0a93ca5-633c-4649-b23a-38f6ad85457c-kube-api-access-dvhrq\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.415505 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-utilities\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.415966 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-catalog-content\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.416114 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-utilities\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.461909 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvhrq\" (UniqueName: \"kubernetes.io/projected/e0a93ca5-633c-4649-b23a-38f6ad85457c-kube-api-access-dvhrq\") pod \"community-operators-nqdjn\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:51 crc kubenswrapper[4870]: I0130 08:35:51.540624 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:35:52 crc kubenswrapper[4870]: I0130 08:35:52.051378 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqdjn"] Jan 30 08:35:53 crc kubenswrapper[4870]: I0130 08:35:53.015958 4870 generic.go:334] "Generic (PLEG): container finished" podID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerID="5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0" exitCode=0 Jan 30 08:35:53 crc kubenswrapper[4870]: I0130 08:35:53.016048 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdjn" event={"ID":"e0a93ca5-633c-4649-b23a-38f6ad85457c","Type":"ContainerDied","Data":"5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0"} Jan 30 08:35:53 crc kubenswrapper[4870]: I0130 08:35:53.016261 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdjn" event={"ID":"e0a93ca5-633c-4649-b23a-38f6ad85457c","Type":"ContainerStarted","Data":"9d39a058afc74a56bfb3df796b94ad3e8258ca3af82e61dd9d6b307c12ee9260"} Jan 30 08:35:53 crc kubenswrapper[4870]: I0130 08:35:53.021868 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 08:35:54 crc kubenswrapper[4870]: I0130 08:35:54.028441 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdjn" event={"ID":"e0a93ca5-633c-4649-b23a-38f6ad85457c","Type":"ContainerStarted","Data":"346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839"} Jan 30 08:35:55 crc kubenswrapper[4870]: I0130 08:35:55.041262 4870 generic.go:334] "Generic (PLEG): container finished" podID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerID="346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839" exitCode=0 Jan 30 08:35:55 crc kubenswrapper[4870]: I0130 08:35:55.041331 4870 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-nqdjn" event={"ID":"e0a93ca5-633c-4649-b23a-38f6ad85457c","Type":"ContainerDied","Data":"346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839"} Jan 30 08:35:56 crc kubenswrapper[4870]: I0130 08:35:56.053581 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdjn" event={"ID":"e0a93ca5-633c-4649-b23a-38f6ad85457c","Type":"ContainerStarted","Data":"c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13"} Jan 30 08:35:56 crc kubenswrapper[4870]: I0130 08:35:56.079255 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nqdjn" podStartSLOduration=2.64546292 podStartE2EDuration="5.079235179s" podCreationTimestamp="2026-01-30 08:35:51 +0000 UTC" firstStartedPulling="2026-01-30 08:35:53.021632497 +0000 UTC m=+1591.717179596" lastFinishedPulling="2026-01-30 08:35:55.455404746 +0000 UTC m=+1594.150951855" observedRunningTime="2026-01-30 08:35:56.069691873 +0000 UTC m=+1594.765239012" watchObservedRunningTime="2026-01-30 08:35:56.079235179 +0000 UTC m=+1594.774782298" Jan 30 08:36:01 crc kubenswrapper[4870]: I0130 08:36:01.540986 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:36:01 crc kubenswrapper[4870]: I0130 08:36:01.541621 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:36:01 crc kubenswrapper[4870]: I0130 08:36:01.607069 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:36:02 crc kubenswrapper[4870]: I0130 08:36:02.081565 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:36:02 crc kubenswrapper[4870]: E0130 08:36:02.082084 4870 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:36:02 crc kubenswrapper[4870]: I0130 08:36:02.171748 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:36:02 crc kubenswrapper[4870]: I0130 08:36:02.243774 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqdjn"] Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.145675 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nqdjn" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="registry-server" containerID="cri-o://c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13" gracePeriod=2 Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.622463 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.706380 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-utilities\") pod \"e0a93ca5-633c-4649-b23a-38f6ad85457c\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.706500 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-catalog-content\") pod \"e0a93ca5-633c-4649-b23a-38f6ad85457c\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.706593 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvhrq\" (UniqueName: \"kubernetes.io/projected/e0a93ca5-633c-4649-b23a-38f6ad85457c-kube-api-access-dvhrq\") pod \"e0a93ca5-633c-4649-b23a-38f6ad85457c\" (UID: \"e0a93ca5-633c-4649-b23a-38f6ad85457c\") " Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.708436 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-utilities" (OuterVolumeSpecName: "utilities") pod "e0a93ca5-633c-4649-b23a-38f6ad85457c" (UID: "e0a93ca5-633c-4649-b23a-38f6ad85457c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.712084 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a93ca5-633c-4649-b23a-38f6ad85457c-kube-api-access-dvhrq" (OuterVolumeSpecName: "kube-api-access-dvhrq") pod "e0a93ca5-633c-4649-b23a-38f6ad85457c" (UID: "e0a93ca5-633c-4649-b23a-38f6ad85457c"). InnerVolumeSpecName "kube-api-access-dvhrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.752664 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0a93ca5-633c-4649-b23a-38f6ad85457c" (UID: "e0a93ca5-633c-4649-b23a-38f6ad85457c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.809399 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvhrq\" (UniqueName: \"kubernetes.io/projected/e0a93ca5-633c-4649-b23a-38f6ad85457c-kube-api-access-dvhrq\") on node \"crc\" DevicePath \"\"" Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.809438 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:36:04 crc kubenswrapper[4870]: I0130 08:36:04.809450 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0a93ca5-633c-4649-b23a-38f6ad85457c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.157774 4870 generic.go:334] "Generic (PLEG): container finished" podID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerID="c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13" exitCode=0 Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.157859 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqdjn" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.157947 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdjn" event={"ID":"e0a93ca5-633c-4649-b23a-38f6ad85457c","Type":"ContainerDied","Data":"c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13"} Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.158293 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqdjn" event={"ID":"e0a93ca5-633c-4649-b23a-38f6ad85457c","Type":"ContainerDied","Data":"9d39a058afc74a56bfb3df796b94ad3e8258ca3af82e61dd9d6b307c12ee9260"} Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.158329 4870 scope.go:117] "RemoveContainer" containerID="c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.191476 4870 scope.go:117] "RemoveContainer" containerID="346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.201046 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqdjn"] Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.212951 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nqdjn"] Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.226279 4870 scope.go:117] "RemoveContainer" containerID="5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.289731 4870 scope.go:117] "RemoveContainer" containerID="c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13" Jan 30 08:36:05 crc kubenswrapper[4870]: E0130 08:36:05.290166 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13\": container with ID starting with c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13 not found: ID does not exist" containerID="c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.290205 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13"} err="failed to get container status \"c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13\": rpc error: code = NotFound desc = could not find container \"c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13\": container with ID starting with c35189f24773e7e5586f4dab8b7ae4b1c1e1a8d3835f49323c3e68c922681e13 not found: ID does not exist" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.290230 4870 scope.go:117] "RemoveContainer" containerID="346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839" Jan 30 08:36:05 crc kubenswrapper[4870]: E0130 08:36:05.290666 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839\": container with ID starting with 346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839 not found: ID does not exist" containerID="346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.290693 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839"} err="failed to get container status \"346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839\": rpc error: code = NotFound desc = could not find container \"346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839\": container with ID 
starting with 346f08667613a1597067cc9e0b8f0190c7c39e39ac5f3b57bc30e1bd745cd839 not found: ID does not exist" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.290712 4870 scope.go:117] "RemoveContainer" containerID="5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0" Jan 30 08:36:05 crc kubenswrapper[4870]: E0130 08:36:05.291063 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0\": container with ID starting with 5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0 not found: ID does not exist" containerID="5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0" Jan 30 08:36:05 crc kubenswrapper[4870]: I0130 08:36:05.291087 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0"} err="failed to get container status \"5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0\": rpc error: code = NotFound desc = could not find container \"5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0\": container with ID starting with 5f1c509eeee8b7481f1c459ae79215514f74b744ea680d7d513e15629ff5f1d0 not found: ID does not exist" Jan 30 08:36:06 crc kubenswrapper[4870]: I0130 08:36:06.089843 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" path="/var/lib/kubelet/pods/e0a93ca5-633c-4649-b23a-38f6ad85457c/volumes" Jan 30 08:36:15 crc kubenswrapper[4870]: I0130 08:36:15.077637 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:36:15 crc kubenswrapper[4870]: E0130 08:36:15.078320 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:36:26 crc kubenswrapper[4870]: I0130 08:36:26.075490 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:36:26 crc kubenswrapper[4870]: E0130 08:36:26.076536 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.537241 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vwzrt"] Jan 30 08:36:30 crc kubenswrapper[4870]: E0130 08:36:30.538360 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="extract-content" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.538378 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="extract-content" Jan 30 08:36:30 crc kubenswrapper[4870]: E0130 08:36:30.538399 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="registry-server" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.539261 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="registry-server" Jan 30 08:36:30 crc kubenswrapper[4870]: E0130 08:36:30.539310 4870 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="extract-utilities" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.539320 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="extract-utilities" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.539561 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a93ca5-633c-4649-b23a-38f6ad85457c" containerName="registry-server" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.541358 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.564112 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwzrt"] Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.576055 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-catalog-content\") pod \"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.576094 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9wgd\" (UniqueName: \"kubernetes.io/projected/6a6d20bc-8755-40fc-a830-91f52584145f-kube-api-access-x9wgd\") pod \"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.576120 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-utilities\") pod 
\"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.677955 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-catalog-content\") pod \"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.678032 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wgd\" (UniqueName: \"kubernetes.io/projected/6a6d20bc-8755-40fc-a830-91f52584145f-kube-api-access-x9wgd\") pod \"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.678076 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-utilities\") pod \"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.678555 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-catalog-content\") pod \"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.678602 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-utilities\") pod \"certified-operators-vwzrt\" (UID: 
\"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.710852 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9wgd\" (UniqueName: \"kubernetes.io/projected/6a6d20bc-8755-40fc-a830-91f52584145f-kube-api-access-x9wgd\") pod \"certified-operators-vwzrt\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:30 crc kubenswrapper[4870]: I0130 08:36:30.874415 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:31 crc kubenswrapper[4870]: I0130 08:36:31.408701 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwzrt"] Jan 30 08:36:31 crc kubenswrapper[4870]: W0130 08:36:31.420735 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a6d20bc_8755_40fc_a830_91f52584145f.slice/crio-6ed1107024871effb54a515b5ce8f15e0ec80ecb07f4b73d435b44ce270b6360 WatchSource:0}: Error finding container 6ed1107024871effb54a515b5ce8f15e0ec80ecb07f4b73d435b44ce270b6360: Status 404 returned error can't find the container with id 6ed1107024871effb54a515b5ce8f15e0ec80ecb07f4b73d435b44ce270b6360 Jan 30 08:36:32 crc kubenswrapper[4870]: I0130 08:36:32.445573 4870 generic.go:334] "Generic (PLEG): container finished" podID="6a6d20bc-8755-40fc-a830-91f52584145f" containerID="4829db839cbadba4b50ba838a0028137cd5bae1a5e995ea792f030384fd924d0" exitCode=0 Jan 30 08:36:32 crc kubenswrapper[4870]: I0130 08:36:32.445742 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwzrt" event={"ID":"6a6d20bc-8755-40fc-a830-91f52584145f","Type":"ContainerDied","Data":"4829db839cbadba4b50ba838a0028137cd5bae1a5e995ea792f030384fd924d0"} Jan 30 08:36:32 
crc kubenswrapper[4870]: I0130 08:36:32.446108 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwzrt" event={"ID":"6a6d20bc-8755-40fc-a830-91f52584145f","Type":"ContainerStarted","Data":"6ed1107024871effb54a515b5ce8f15e0ec80ecb07f4b73d435b44ce270b6360"} Jan 30 08:36:33 crc kubenswrapper[4870]: I0130 08:36:33.459680 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwzrt" event={"ID":"6a6d20bc-8755-40fc-a830-91f52584145f","Type":"ContainerStarted","Data":"ebba5706fbb7a13afe99283a09c87f3f21b1fffa0593913b311f354c43a2956e"} Jan 30 08:36:34 crc kubenswrapper[4870]: I0130 08:36:34.468519 4870 generic.go:334] "Generic (PLEG): container finished" podID="6a6d20bc-8755-40fc-a830-91f52584145f" containerID="ebba5706fbb7a13afe99283a09c87f3f21b1fffa0593913b311f354c43a2956e" exitCode=0 Jan 30 08:36:34 crc kubenswrapper[4870]: I0130 08:36:34.468603 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwzrt" event={"ID":"6a6d20bc-8755-40fc-a830-91f52584145f","Type":"ContainerDied","Data":"ebba5706fbb7a13afe99283a09c87f3f21b1fffa0593913b311f354c43a2956e"} Jan 30 08:36:35 crc kubenswrapper[4870]: I0130 08:36:35.538726 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwzrt" event={"ID":"6a6d20bc-8755-40fc-a830-91f52584145f","Type":"ContainerStarted","Data":"680e1b239b6bc11690a9df9dd3a6ce4a2ed253f413cc48929dead689ad8e2fb9"} Jan 30 08:36:35 crc kubenswrapper[4870]: I0130 08:36:35.567409 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vwzrt" podStartSLOduration=3.081184265 podStartE2EDuration="5.567387982s" podCreationTimestamp="2026-01-30 08:36:30 +0000 UTC" firstStartedPulling="2026-01-30 08:36:32.450633323 +0000 UTC m=+1631.146180472" lastFinishedPulling="2026-01-30 08:36:34.93683708 +0000 UTC 
m=+1633.632384189" observedRunningTime="2026-01-30 08:36:35.557507895 +0000 UTC m=+1634.253055004" watchObservedRunningTime="2026-01-30 08:36:35.567387982 +0000 UTC m=+1634.262935091" Jan 30 08:36:38 crc kubenswrapper[4870]: I0130 08:36:38.074640 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:36:38 crc kubenswrapper[4870]: E0130 08:36:38.075478 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:36:40 crc kubenswrapper[4870]: I0130 08:36:40.875627 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:40 crc kubenswrapper[4870]: I0130 08:36:40.876056 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:40 crc kubenswrapper[4870]: I0130 08:36:40.933520 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:41 crc kubenswrapper[4870]: I0130 08:36:41.686175 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:41 crc kubenswrapper[4870]: I0130 08:36:41.731837 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwzrt"] Jan 30 08:36:44 crc kubenswrapper[4870]: I0130 08:36:44.191422 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vwzrt" 
podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="registry-server" containerID="cri-o://680e1b239b6bc11690a9df9dd3a6ce4a2ed253f413cc48929dead689ad8e2fb9" gracePeriod=2 Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.204843 4870 generic.go:334] "Generic (PLEG): container finished" podID="6a6d20bc-8755-40fc-a830-91f52584145f" containerID="680e1b239b6bc11690a9df9dd3a6ce4a2ed253f413cc48929dead689ad8e2fb9" exitCode=0 Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.204902 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwzrt" event={"ID":"6a6d20bc-8755-40fc-a830-91f52584145f","Type":"ContainerDied","Data":"680e1b239b6bc11690a9df9dd3a6ce4a2ed253f413cc48929dead689ad8e2fb9"} Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.430938 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.549163 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9wgd\" (UniqueName: \"kubernetes.io/projected/6a6d20bc-8755-40fc-a830-91f52584145f-kube-api-access-x9wgd\") pod \"6a6d20bc-8755-40fc-a830-91f52584145f\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.549231 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-catalog-content\") pod \"6a6d20bc-8755-40fc-a830-91f52584145f\" (UID: \"6a6d20bc-8755-40fc-a830-91f52584145f\") " Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.549282 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-utilities\") pod \"6a6d20bc-8755-40fc-a830-91f52584145f\" (UID: 
\"6a6d20bc-8755-40fc-a830-91f52584145f\") " Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.550325 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-utilities" (OuterVolumeSpecName: "utilities") pod "6a6d20bc-8755-40fc-a830-91f52584145f" (UID: "6a6d20bc-8755-40fc-a830-91f52584145f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.555526 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6d20bc-8755-40fc-a830-91f52584145f-kube-api-access-x9wgd" (OuterVolumeSpecName: "kube-api-access-x9wgd") pod "6a6d20bc-8755-40fc-a830-91f52584145f" (UID: "6a6d20bc-8755-40fc-a830-91f52584145f"). InnerVolumeSpecName "kube-api-access-x9wgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.604680 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a6d20bc-8755-40fc-a830-91f52584145f" (UID: "6a6d20bc-8755-40fc-a830-91f52584145f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.652337 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9wgd\" (UniqueName: \"kubernetes.io/projected/6a6d20bc-8755-40fc-a830-91f52584145f-kube-api-access-x9wgd\") on node \"crc\" DevicePath \"\"" Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.652396 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:36:45 crc kubenswrapper[4870]: I0130 08:36:45.652410 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6d20bc-8755-40fc-a830-91f52584145f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:36:46 crc kubenswrapper[4870]: I0130 08:36:46.218632 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwzrt" event={"ID":"6a6d20bc-8755-40fc-a830-91f52584145f","Type":"ContainerDied","Data":"6ed1107024871effb54a515b5ce8f15e0ec80ecb07f4b73d435b44ce270b6360"} Jan 30 08:36:46 crc kubenswrapper[4870]: I0130 08:36:46.219378 4870 scope.go:117] "RemoveContainer" containerID="680e1b239b6bc11690a9df9dd3a6ce4a2ed253f413cc48929dead689ad8e2fb9" Jan 30 08:36:46 crc kubenswrapper[4870]: I0130 08:36:46.218807 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vwzrt" Jan 30 08:36:46 crc kubenswrapper[4870]: I0130 08:36:46.248207 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwzrt"] Jan 30 08:36:46 crc kubenswrapper[4870]: I0130 08:36:46.250855 4870 scope.go:117] "RemoveContainer" containerID="ebba5706fbb7a13afe99283a09c87f3f21b1fffa0593913b311f354c43a2956e" Jan 30 08:36:46 crc kubenswrapper[4870]: I0130 08:36:46.261930 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vwzrt"] Jan 30 08:36:46 crc kubenswrapper[4870]: I0130 08:36:46.280519 4870 scope.go:117] "RemoveContainer" containerID="4829db839cbadba4b50ba838a0028137cd5bae1a5e995ea792f030384fd924d0" Jan 30 08:36:48 crc kubenswrapper[4870]: I0130 08:36:48.085357 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" path="/var/lib/kubelet/pods/6a6d20bc-8755-40fc-a830-91f52584145f/volumes" Jan 30 08:36:52 crc kubenswrapper[4870]: I0130 08:36:52.096022 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:36:52 crc kubenswrapper[4870]: E0130 08:36:52.096734 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:37:07 crc kubenswrapper[4870]: I0130 08:37:07.074769 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:37:07 crc kubenswrapper[4870]: E0130 08:37:07.075663 4870 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.098802 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b24b-account-create-update-d2n4p"] Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.109713 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-be9b-account-create-update-lgqm6"] Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.122671 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b24b-account-create-update-d2n4p"] Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.136413 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-be9b-account-create-update-lgqm6"] Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.156511 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-772bw"] Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.164406 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zdg4s"] Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.172482 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zdg4s"] Jan 30 08:37:13 crc kubenswrapper[4870]: I0130 08:37:13.184073 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-772bw"] Jan 30 08:37:14 crc kubenswrapper[4870]: I0130 08:37:14.097565 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b66abfb-27d1-415e-abf2-2cb855a2bcaf" path="/var/lib/kubelet/pods/3b66abfb-27d1-415e-abf2-2cb855a2bcaf/volumes" Jan 30 
08:37:14 crc kubenswrapper[4870]: I0130 08:37:14.098498 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac3a52d-4734-4be8-9530-6b7b535664f8" path="/var/lib/kubelet/pods/5ac3a52d-4734-4be8-9530-6b7b535664f8/volumes"
Jan 30 08:37:14 crc kubenswrapper[4870]: I0130 08:37:14.099558 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93cd49cf-8353-49eb-89d2-2d3630503d9f" path="/var/lib/kubelet/pods/93cd49cf-8353-49eb-89d2-2d3630503d9f/volumes"
Jan 30 08:37:14 crc kubenswrapper[4870]: I0130 08:37:14.100365 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e990d4f-b684-47e6-8056-08cf765aa33d" path="/var/lib/kubelet/pods/9e990d4f-b684-47e6-8056-08cf765aa33d/volumes"
Jan 30 08:37:15 crc kubenswrapper[4870]: I0130 08:37:15.034231 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-a8a4-account-create-update-8gm2f"]
Jan 30 08:37:15 crc kubenswrapper[4870]: I0130 08:37:15.047224 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-a8a4-account-create-update-8gm2f"]
Jan 30 08:37:15 crc kubenswrapper[4870]: I0130 08:37:15.056488 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-x6s7d"]
Jan 30 08:37:15 crc kubenswrapper[4870]: I0130 08:37:15.064511 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-x6s7d"]
Jan 30 08:37:16 crc kubenswrapper[4870]: I0130 08:37:16.091430 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3" path="/var/lib/kubelet/pods/40a93c77-4e48-4cd1-b89b-6a7de4d9c5e3/volumes"
Jan 30 08:37:16 crc kubenswrapper[4870]: I0130 08:37:16.093078 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585a2047-d3db-4822-89b3-52fcd65d6e09" path="/var/lib/kubelet/pods/585a2047-d3db-4822-89b3-52fcd65d6e09/volumes"
Jan 30 08:37:21 crc kubenswrapper[4870]: I0130 08:37:21.041151 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cpgc6"]
Jan 30 08:37:21 crc kubenswrapper[4870]: I0130 08:37:21.051393 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cpgc6"]
Jan 30 08:37:22 crc kubenswrapper[4870]: I0130 08:37:22.086752 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:37:22 crc kubenswrapper[4870]: E0130 08:37:22.087273 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:37:22 crc kubenswrapper[4870]: I0130 08:37:22.087428 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc3ddf0-5fc8-4425-a434-1452753e1297" path="/var/lib/kubelet/pods/8bc3ddf0-5fc8-4425-a434-1452753e1297/volumes"
Jan 30 08:37:34 crc kubenswrapper[4870]: I0130 08:37:34.074914 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:37:34 crc kubenswrapper[4870]: E0130 08:37:34.075742 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.514993 4870 scope.go:117] "RemoveContainer" containerID="7e7325618d20bdeab54c732bc7a397cb58a9db4a697a599a002533f4811bf8bd"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.551804 4870 scope.go:117] "RemoveContainer" containerID="1e43a638833e8a28b17503377129992ef8df2c8dae8700c2567db5f0ab6b74f9"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.605676 4870 scope.go:117] "RemoveContainer" containerID="6dec8f4d9911b49219f94545d1dff11226dd491baa26e53a02289cf2ce287699"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.647146 4870 scope.go:117] "RemoveContainer" containerID="cf022953959b6108a335c22e59a92909beccc351b6ace66848278caae812affb"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.699327 4870 scope.go:117] "RemoveContainer" containerID="e32e8f9eb095a0767af1259a467ea84160f17bae2cb726e02486629d03a26d33"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.744204 4870 scope.go:117] "RemoveContainer" containerID="78530e29e6f33fe9e6244539f845bfc30d3752986bcfd2b607b62cc6f7d5aab3"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.822082 4870 scope.go:117] "RemoveContainer" containerID="5726ace895a9d7102cc621cf411a4327a47995798d8abdba29b293b762399c80"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.845288 4870 scope.go:117] "RemoveContainer" containerID="895f5e0a2008516657010356d30e83d3b79850fdf910e5ede0c0b5280b3040c2"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.865697 4870 scope.go:117] "RemoveContainer" containerID="ae557205b83ba573012321c0b15a5b47277e108dca93d5acd055965c34b03da8"
Jan 30 08:37:35 crc kubenswrapper[4870]: I0130 08:37:35.884127 4870 scope.go:117] "RemoveContainer" containerID="fa09eeaef0e8d067370ba4e9a769247437b75f3bbc783ef72b9b39a713b37db0"
Jan 30 08:37:37 crc kubenswrapper[4870]: I0130 08:37:37.791228 4870 generic.go:334] "Generic (PLEG): container finished" podID="620aba2c-f389-4fc9-a27c-28c937894f7d" containerID="9059f589d52c73a92e30dc6901a5f313013722880c7f17d79aebc9067dcd7fa9" exitCode=0
Jan 30 08:37:37 crc kubenswrapper[4870]: I0130 08:37:37.791340 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" event={"ID":"620aba2c-f389-4fc9-a27c-28c937894f7d","Type":"ContainerDied","Data":"9059f589d52c73a92e30dc6901a5f313013722880c7f17d79aebc9067dcd7fa9"}
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.323696 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.423135 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-inventory\") pod \"620aba2c-f389-4fc9-a27c-28c937894f7d\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") "
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.423569 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-bootstrap-combined-ca-bundle\") pod \"620aba2c-f389-4fc9-a27c-28c937894f7d\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") "
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.423721 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-ssh-key-openstack-edpm-ipam\") pod \"620aba2c-f389-4fc9-a27c-28c937894f7d\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") "
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.423857 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-725dm\" (UniqueName: \"kubernetes.io/projected/620aba2c-f389-4fc9-a27c-28c937894f7d-kube-api-access-725dm\") pod \"620aba2c-f389-4fc9-a27c-28c937894f7d\" (UID: \"620aba2c-f389-4fc9-a27c-28c937894f7d\") "
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.430615 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/620aba2c-f389-4fc9-a27c-28c937894f7d-kube-api-access-725dm" (OuterVolumeSpecName: "kube-api-access-725dm") pod "620aba2c-f389-4fc9-a27c-28c937894f7d" (UID: "620aba2c-f389-4fc9-a27c-28c937894f7d"). InnerVolumeSpecName "kube-api-access-725dm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.431140 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "620aba2c-f389-4fc9-a27c-28c937894f7d" (UID: "620aba2c-f389-4fc9-a27c-28c937894f7d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.454090 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "620aba2c-f389-4fc9-a27c-28c937894f7d" (UID: "620aba2c-f389-4fc9-a27c-28c937894f7d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.454473 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-inventory" (OuterVolumeSpecName: "inventory") pod "620aba2c-f389-4fc9-a27c-28c937894f7d" (UID: "620aba2c-f389-4fc9-a27c-28c937894f7d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.526313 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-725dm\" (UniqueName: \"kubernetes.io/projected/620aba2c-f389-4fc9-a27c-28c937894f7d-kube-api-access-725dm\") on node \"crc\" DevicePath \"\""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.526351 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.526361 4870 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.526370 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/620aba2c-f389-4fc9-a27c-28c937894f7d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.817038 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c" event={"ID":"620aba2c-f389-4fc9-a27c-28c937894f7d","Type":"ContainerDied","Data":"5bf4e90c0f45b9a3ea6282a3aabea041683d9fdd400c0367a4321598c264f32e"}
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.817111 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf4e90c0f45b9a3ea6282a3aabea041683d9fdd400c0367a4321598c264f32e"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.817170 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.903955 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"]
Jan 30 08:37:39 crc kubenswrapper[4870]: E0130 08:37:39.904401 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="extract-utilities"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.904418 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="extract-utilities"
Jan 30 08:37:39 crc kubenswrapper[4870]: E0130 08:37:39.904439 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620aba2c-f389-4fc9-a27c-28c937894f7d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.904447 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="620aba2c-f389-4fc9-a27c-28c937894f7d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:37:39 crc kubenswrapper[4870]: E0130 08:37:39.904466 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="extract-content"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.904472 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="extract-content"
Jan 30 08:37:39 crc kubenswrapper[4870]: E0130 08:37:39.904498 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="registry-server"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.904504 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="registry-server"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.904660 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6d20bc-8755-40fc-a830-91f52584145f" containerName="registry-server"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.904698 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="620aba2c-f389-4fc9-a27c-28c937894f7d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.905353 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.921958 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.922246 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.922338 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.922563 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:37:39 crc kubenswrapper[4870]: I0130 08:37:39.925096 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"]
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.041984 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.042329 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtqt9\" (UniqueName: \"kubernetes.io/projected/9bef3cd3-94ab-486e-91de-c0ede57769d8-kube-api-access-gtqt9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.042437 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.042644 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kqrrr"]
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.051523 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6lzp5"]
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.060493 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kqrrr"]
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.069587 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6lzp5"]
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.088678 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d425622-da05-4988-a059-013c06b4ecf1" path="/var/lib/kubelet/pods/4d425622-da05-4988-a059-013c06b4ecf1/volumes"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.089492 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f46507-531f-4d06-86d9-6c07a50abc6d" path="/var/lib/kubelet/pods/59f46507-531f-4d06-86d9-6c07a50abc6d/volumes"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.144816 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtqt9\" (UniqueName: \"kubernetes.io/projected/9bef3cd3-94ab-486e-91de-c0ede57769d8-kube-api-access-gtqt9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.144905 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.145012 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.149056 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.153570 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.168604 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtqt9\" (UniqueName: \"kubernetes.io/projected/9bef3cd3-94ab-486e-91de-c0ede57769d8-kube-api-access-gtqt9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p67q7\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.246599 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:37:40 crc kubenswrapper[4870]: I0130 08:37:40.823400 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"]
Jan 30 08:37:41 crc kubenswrapper[4870]: I0130 08:37:41.844218 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7" event={"ID":"9bef3cd3-94ab-486e-91de-c0ede57769d8","Type":"ContainerStarted","Data":"626d115ac205077a01bbe8f25312875b05af3e1a0b1ae6dc536bf8f8aea4f69b"}
Jan 30 08:37:42 crc kubenswrapper[4870]: I0130 08:37:42.864511 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7" event={"ID":"9bef3cd3-94ab-486e-91de-c0ede57769d8","Type":"ContainerStarted","Data":"36acfc5bf960402457c7f8cc9040b5a8f64a76024ba8309a1355304bfe83c1d9"}
Jan 30 08:37:42 crc kubenswrapper[4870]: I0130 08:37:42.892161 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7" podStartSLOduration=2.85257085 podStartE2EDuration="3.89213968s" podCreationTimestamp="2026-01-30 08:37:39 +0000 UTC" firstStartedPulling="2026-01-30 08:37:40.827052008 +0000 UTC m=+1699.522599157" lastFinishedPulling="2026-01-30 08:37:41.866620868 +0000 UTC m=+1700.562167987" observedRunningTime="2026-01-30 08:37:42.88928013 +0000 UTC m=+1701.584827249" watchObservedRunningTime="2026-01-30 08:37:42.89213968 +0000 UTC m=+1701.587686789"
Jan 30 08:37:46 crc kubenswrapper[4870]: I0130 08:37:46.074720 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:37:46 crc kubenswrapper[4870]: E0130 08:37:46.075366 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.038354 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8td6r"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.047566 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xrsjh"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.057360 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9d1f-account-create-update-mffzg"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.067575 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6de9-account-create-update-nwcgl"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.083911 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8td6r"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.090176 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9d1f-account-create-update-mffzg"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.098529 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xrsjh"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.115280 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-937e-account-create-update-6w49r"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.122653 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6de9-account-create-update-nwcgl"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.130825 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-937e-account-create-update-6w49r"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.138227 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0515-account-create-update-rln5d"]
Jan 30 08:37:48 crc kubenswrapper[4870]: I0130 08:37:48.146368 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0515-account-create-update-rln5d"]
Jan 30 08:37:50 crc kubenswrapper[4870]: I0130 08:37:50.086124 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="051874aa-a01e-40bf-a987-a830886ea878" path="/var/lib/kubelet/pods/051874aa-a01e-40bf-a987-a830886ea878/volumes"
Jan 30 08:37:50 crc kubenswrapper[4870]: I0130 08:37:50.087374 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e1f740-4393-4ba2-8242-fb863196cb02" path="/var/lib/kubelet/pods/17e1f740-4393-4ba2-8242-fb863196cb02/volumes"
Jan 30 08:37:50 crc kubenswrapper[4870]: I0130 08:37:50.088394 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19155d05-01da-4e21-96c2-f23662f8f785" path="/var/lib/kubelet/pods/19155d05-01da-4e21-96c2-f23662f8f785/volumes"
Jan 30 08:37:50 crc kubenswrapper[4870]: I0130 08:37:50.089167 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6566e49-850d-460e-9a22-9bfd7384f494" path="/var/lib/kubelet/pods/b6566e49-850d-460e-9a22-9bfd7384f494/volumes"
Jan 30 08:37:50 crc kubenswrapper[4870]: I0130 08:37:50.089807 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc35112-b552-434a-b702-26c53cbf5574" path="/var/lib/kubelet/pods/dfc35112-b552-434a-b702-26c53cbf5574/volumes"
Jan 30 08:37:50 crc kubenswrapper[4870]: I0130 08:37:50.090456 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb61b735-bf9c-4bf5-a5cf-1948435af72e" path="/var/lib/kubelet/pods/eb61b735-bf9c-4bf5-a5cf-1948435af72e/volumes"
Jan 30 08:37:53 crc kubenswrapper[4870]: I0130 08:37:53.049545 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-gbfzh"]
Jan 30 08:37:53 crc kubenswrapper[4870]: I0130 08:37:53.057311 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-gbfzh"]
Jan 30 08:37:54 crc kubenswrapper[4870]: I0130 08:37:54.039396 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-f6r68"]
Jan 30 08:37:54 crc kubenswrapper[4870]: I0130 08:37:54.052971 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-f6r68"]
Jan 30 08:37:54 crc kubenswrapper[4870]: I0130 08:37:54.085740 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881527d5-776b-4639-9306-895d1e370abd" path="/var/lib/kubelet/pods/881527d5-776b-4639-9306-895d1e370abd/volumes"
Jan 30 08:37:54 crc kubenswrapper[4870]: I0130 08:37:54.093854 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8637667-8b7e-455e-8ba9-b6291574e4ce" path="/var/lib/kubelet/pods/e8637667-8b7e-455e-8ba9-b6291574e4ce/volumes"
Jan 30 08:37:59 crc kubenswrapper[4870]: I0130 08:37:59.074598 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:37:59 crc kubenswrapper[4870]: E0130 08:37:59.075337 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:38:14 crc kubenswrapper[4870]: I0130 08:38:14.077838 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:38:14 crc kubenswrapper[4870]: E0130 08:38:14.078647 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:38:29 crc kubenswrapper[4870]: I0130 08:38:29.075532 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:38:29 crc kubenswrapper[4870]: E0130 08:38:29.077630 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.047080 4870 scope.go:117] "RemoveContainer" containerID="0e37f5a9ce757405b6d3e4a3c9aee5ca81c0dd18541f09603a2ca8623a81a084"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.111544 4870 scope.go:117] "RemoveContainer" containerID="ebb6defef32112bcd4f761a254fe06dd72ca1e2b11d0f09023e3983d12f747be"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.147018 4870 scope.go:117] "RemoveContainer" containerID="c7c08f4bc1bd775e569c12ce6f45113dd74be7d4b1436663db01b3cc4e31c119"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.214673 4870 scope.go:117] "RemoveContainer" containerID="a86a16c99cbecfe80af026afdf8bc6eec15eafc6658e69be7a63babe7a18aa00"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.267077 4870 scope.go:117] "RemoveContainer" containerID="c19fa8ba72448fbf848d632a4b2c87c38ba00d3573897003f02d36a9263593ff"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.308537 4870 scope.go:117] "RemoveContainer" containerID="cb21d97629ca986697c0491abe322efbe6175a053c82fd76648eaa5b827fb2cc"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.357001 4870 scope.go:117] "RemoveContainer" containerID="6dcb2a606401562e049d19a34d68af34e28fc99c34413a4f7cfddf60bc5211ee"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.376332 4870 scope.go:117] "RemoveContainer" containerID="96c828944b59ded4cdb603b725476894266a7134b9d788fbea6f5b49b309942a"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.394010 4870 scope.go:117] "RemoveContainer" containerID="b1933043ebcbf2051360c783e7b0fa2a563a6c4cee962802cf9d526f5fcd348c"
Jan 30 08:38:36 crc kubenswrapper[4870]: I0130 08:38:36.416990 4870 scope.go:117] "RemoveContainer" containerID="ce2685881a857cd53a444b13c2f7aef4bd6f5c6b26f0a8cbc8a0c60a7f826c60"
Jan 30 08:38:40 crc kubenswrapper[4870]: I0130 08:38:40.075383 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:38:40 crc kubenswrapper[4870]: E0130 08:38:40.076361 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:38:43 crc kubenswrapper[4870]: I0130 08:38:43.060991 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-g4m9m"]
Jan 30 08:38:43 crc kubenswrapper[4870]: I0130 08:38:43.069980 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-g4m9m"]
Jan 30 08:38:44 crc kubenswrapper[4870]: I0130 08:38:44.096729 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b91a69-f8ad-4d1d-a47d-c1921071c71a" path="/var/lib/kubelet/pods/b9b91a69-f8ad-4d1d-a47d-c1921071c71a/volumes"
Jan 30 08:38:54 crc kubenswrapper[4870]: I0130 08:38:54.074846 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:38:54 crc kubenswrapper[4870]: E0130 08:38:54.077565 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:38:57 crc kubenswrapper[4870]: I0130 08:38:57.068649 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-b57k5"]
Jan 30 08:38:57 crc kubenswrapper[4870]: I0130 08:38:57.080234 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-b57k5"]
Jan 30 08:38:58 crc kubenswrapper[4870]: I0130 08:38:58.085963 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1435e0c6-e24a-44d4-bf78-3e5300e23cdd" path="/var/lib/kubelet/pods/1435e0c6-e24a-44d4-bf78-3e5300e23cdd/volumes"
Jan 30 08:39:07 crc kubenswrapper[4870]: I0130 08:39:07.075217 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:39:07 crc kubenswrapper[4870]: E0130 08:39:07.075865 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:39:08 crc kubenswrapper[4870]: I0130 08:39:08.048973 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-d2mx7"]
Jan 30 08:39:08 crc kubenswrapper[4870]: I0130 08:39:08.065092 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9g27p"]
Jan 30 08:39:08 crc kubenswrapper[4870]: I0130 08:39:08.103479 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-d2mx7"]
Jan 30 08:39:08 crc kubenswrapper[4870]: I0130 08:39:08.103537 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9g27p"]
Jan 30 08:39:09 crc kubenswrapper[4870]: I0130 08:39:09.033860 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9mjj4"]
Jan 30 08:39:09 crc kubenswrapper[4870]: I0130 08:39:09.043056 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9mjj4"]
Jan 30 08:39:10 crc kubenswrapper[4870]: I0130 08:39:10.089810 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505df376-c8bc-44ce-9c14-8cf94730c550" path="/var/lib/kubelet/pods/505df376-c8bc-44ce-9c14-8cf94730c550/volumes"
Jan 30 08:39:10 crc kubenswrapper[4870]: I0130 08:39:10.090623 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685bde78-dea1-4864-a825-af176178bd11" path="/var/lib/kubelet/pods/685bde78-dea1-4864-a825-af176178bd11/volumes"
Jan 30 08:39:10 crc kubenswrapper[4870]: I0130 08:39:10.091598 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bd649e-5c3c-495f-933f-3b516167cbd2" path="/var/lib/kubelet/pods/c3bd649e-5c3c-495f-933f-3b516167cbd2/volumes"
Jan 30 08:39:12 crc kubenswrapper[4870]: I0130 08:39:12.032488 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tssp8"]
Jan 30 08:39:12 crc kubenswrapper[4870]: I0130 08:39:12.041432 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tssp8"]
Jan 30 08:39:12 crc kubenswrapper[4870]: I0130 08:39:12.086138 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd09a42-14b6-4161-ba2a-82c4cf4f5983" path="/var/lib/kubelet/pods/edd09a42-14b6-4161-ba2a-82c4cf4f5983/volumes"
Jan 30 08:39:18 crc kubenswrapper[4870]: I0130 08:39:18.075118 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:39:18 crc kubenswrapper[4870]: E0130 08:39:18.075838 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:39:31 crc kubenswrapper[4870]: I0130 08:39:31.075579 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49"
Jan 30 08:39:31 crc kubenswrapper[4870]: E0130 08:39:31.076464 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:39:35 crc kubenswrapper[4870]: I0130 08:39:35.003775 4870 generic.go:334] "Generic (PLEG): container finished" podID="9bef3cd3-94ab-486e-91de-c0ede57769d8" containerID="36acfc5bf960402457c7f8cc9040b5a8f64a76024ba8309a1355304bfe83c1d9" exitCode=0
Jan 30 08:39:35 crc kubenswrapper[4870]: I0130 08:39:35.003904 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7" event={"ID":"9bef3cd3-94ab-486e-91de-c0ede57769d8","Type":"ContainerDied","Data":"36acfc5bf960402457c7f8cc9040b5a8f64a76024ba8309a1355304bfe83c1d9"}
Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.439343 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7"
Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.506790 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-inventory\") pod \"9bef3cd3-94ab-486e-91de-c0ede57769d8\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") "
Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.506859 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtqt9\" (UniqueName: \"kubernetes.io/projected/9bef3cd3-94ab-486e-91de-c0ede57769d8-kube-api-access-gtqt9\") pod \"9bef3cd3-94ab-486e-91de-c0ede57769d8\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") "
Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.506916 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-ssh-key-openstack-edpm-ipam\") pod \"9bef3cd3-94ab-486e-91de-c0ede57769d8\" (UID: \"9bef3cd3-94ab-486e-91de-c0ede57769d8\") "
Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.513700 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bef3cd3-94ab-486e-91de-c0ede57769d8-kube-api-access-gtqt9" (OuterVolumeSpecName: "kube-api-access-gtqt9") pod "9bef3cd3-94ab-486e-91de-c0ede57769d8" (UID: "9bef3cd3-94ab-486e-91de-c0ede57769d8"). InnerVolumeSpecName "kube-api-access-gtqt9".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.542829 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9bef3cd3-94ab-486e-91de-c0ede57769d8" (UID: "9bef3cd3-94ab-486e-91de-c0ede57769d8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.547075 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-inventory" (OuterVolumeSpecName: "inventory") pod "9bef3cd3-94ab-486e-91de-c0ede57769d8" (UID: "9bef3cd3-94ab-486e-91de-c0ede57769d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.609080 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.609280 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtqt9\" (UniqueName: \"kubernetes.io/projected/9bef3cd3-94ab-486e-91de-c0ede57769d8-kube-api-access-gtqt9\") on node \"crc\" DevicePath \"\"" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.609345 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bef3cd3-94ab-486e-91de-c0ede57769d8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.612107 4870 scope.go:117] "RemoveContainer" containerID="3e1279140ba8261786354ed9fbacfe2a2a43a2b8decaba7ca7c7b15754ed7ff9" Jan 30 08:39:36 crc kubenswrapper[4870]: 
I0130 08:39:36.650536 4870 scope.go:117] "RemoveContainer" containerID="423e4f8207599a836d08eca85be2c21680c69e731edaed6ac9d59c605d325bfb" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.800911 4870 scope.go:117] "RemoveContainer" containerID="c723dc182803022ba9e618ac6407cbccb617a7c5a0a43457386f580c7a154614" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.845681 4870 scope.go:117] "RemoveContainer" containerID="ac1cfe0654d6d9f59d0d7bba982a578597204c7a7dcbab5f91122bf878031c77" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.873007 4870 scope.go:117] "RemoveContainer" containerID="51fd04d1413a7bb8dd1010fcf50ab478d7211a73c87542e70aaae3ce82cc9053" Jan 30 08:39:36 crc kubenswrapper[4870]: I0130 08:39:36.915668 4870 scope.go:117] "RemoveContainer" containerID="85f1049088e388e69d6da33f4eab9143943bc4d4ba2179d9093657152d474310" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.026730 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7" event={"ID":"9bef3cd3-94ab-486e-91de-c0ede57769d8","Type":"ContainerDied","Data":"626d115ac205077a01bbe8f25312875b05af3e1a0b1ae6dc536bf8f8aea4f69b"} Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.026778 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="626d115ac205077a01bbe8f25312875b05af3e1a0b1ae6dc536bf8f8aea4f69b" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.026839 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p67q7" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.111291 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh"] Jan 30 08:39:37 crc kubenswrapper[4870]: E0130 08:39:37.111907 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bef3cd3-94ab-486e-91de-c0ede57769d8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.111935 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bef3cd3-94ab-486e-91de-c0ede57769d8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.112177 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bef3cd3-94ab-486e-91de-c0ede57769d8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.113066 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.114847 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.115413 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.115503 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.115584 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.140678 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh"] Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.230780 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xplnt\" (UniqueName: \"kubernetes.io/projected/1eea19c9-87be-4160-8c11-c7ecd13cf088-kube-api-access-xplnt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.231037 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 
08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.231261 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.333097 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xplnt\" (UniqueName: \"kubernetes.io/projected/1eea19c9-87be-4160-8c11-c7ecd13cf088-kube-api-access-xplnt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.333263 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.333509 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.338838 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.338665 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.349918 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xplnt\" (UniqueName: \"kubernetes.io/projected/1eea19c9-87be-4160-8c11-c7ecd13cf088-kube-api-access-xplnt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-948rh\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:37 crc kubenswrapper[4870]: I0130 08:39:37.434843 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:39:38 crc kubenswrapper[4870]: W0130 08:39:38.054158 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eea19c9_87be_4160_8c11_c7ecd13cf088.slice/crio-1770fb330557b332b33d9381d3b29b9bcd4d66980b06a1acc3c5507ce1e08776 WatchSource:0}: Error finding container 1770fb330557b332b33d9381d3b29b9bcd4d66980b06a1acc3c5507ce1e08776: Status 404 returned error can't find the container with id 1770fb330557b332b33d9381d3b29b9bcd4d66980b06a1acc3c5507ce1e08776 Jan 30 08:39:38 crc kubenswrapper[4870]: I0130 08:39:38.072697 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh"] Jan 30 08:39:39 crc kubenswrapper[4870]: I0130 08:39:39.053416 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" event={"ID":"1eea19c9-87be-4160-8c11-c7ecd13cf088","Type":"ContainerStarted","Data":"1770fb330557b332b33d9381d3b29b9bcd4d66980b06a1acc3c5507ce1e08776"} Jan 30 08:39:41 crc kubenswrapper[4870]: I0130 08:39:41.077493 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" event={"ID":"1eea19c9-87be-4160-8c11-c7ecd13cf088","Type":"ContainerStarted","Data":"7932ebc87656bf0ae01e190d06d498a15ca8251a6e3095a4f845c9e27a4ab873"} Jan 30 08:39:41 crc kubenswrapper[4870]: I0130 08:39:41.096539 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" podStartSLOduration=2.297782708 podStartE2EDuration="4.096518885s" podCreationTimestamp="2026-01-30 08:39:37 +0000 UTC" firstStartedPulling="2026-01-30 08:39:38.056584454 +0000 UTC m=+1816.752131563" lastFinishedPulling="2026-01-30 08:39:39.855320631 +0000 
UTC m=+1818.550867740" observedRunningTime="2026-01-30 08:39:41.093699706 +0000 UTC m=+1819.789246815" watchObservedRunningTime="2026-01-30 08:39:41.096518885 +0000 UTC m=+1819.792066004" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.218792 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-54dp6"] Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.221241 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.229384 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-54dp6"] Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.343113 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-catalog-content\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.343201 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-utilities\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.343420 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wdd5\" (UniqueName: \"kubernetes.io/projected/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-kube-api-access-4wdd5\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc 
kubenswrapper[4870]: I0130 08:39:42.446114 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-catalog-content\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.446583 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-utilities\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.446907 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wdd5\" (UniqueName: \"kubernetes.io/projected/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-kube-api-access-4wdd5\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.446919 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-catalog-content\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.447291 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-utilities\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.472849 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wdd5\" (UniqueName: \"kubernetes.io/projected/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-kube-api-access-4wdd5\") pod \"redhat-marketplace-54dp6\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:42 crc kubenswrapper[4870]: I0130 08:39:42.540636 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:43 crc kubenswrapper[4870]: I0130 08:39:43.003452 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-54dp6"] Jan 30 08:39:43 crc kubenswrapper[4870]: I0130 08:39:43.108579 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54dp6" event={"ID":"0419d51d-7b10-4e0f-b6ba-196fafeb8df2","Type":"ContainerStarted","Data":"5d13e2421f8c4e836dce39d4509ef089cfed99d261e72152a996ec22cfbe9f95"} Jan 30 08:39:44 crc kubenswrapper[4870]: I0130 08:39:44.123216 4870 generic.go:334] "Generic (PLEG): container finished" podID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerID="a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d" exitCode=0 Jan 30 08:39:44 crc kubenswrapper[4870]: I0130 08:39:44.123276 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54dp6" event={"ID":"0419d51d-7b10-4e0f-b6ba-196fafeb8df2","Type":"ContainerDied","Data":"a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d"} Jan 30 08:39:46 crc kubenswrapper[4870]: I0130 08:39:46.075315 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:39:46 crc kubenswrapper[4870]: E0130 08:39:46.076155 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:39:46 crc kubenswrapper[4870]: I0130 08:39:46.145204 4870 generic.go:334] "Generic (PLEG): container finished" podID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerID="ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8" exitCode=0 Jan 30 08:39:46 crc kubenswrapper[4870]: I0130 08:39:46.145244 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54dp6" event={"ID":"0419d51d-7b10-4e0f-b6ba-196fafeb8df2","Type":"ContainerDied","Data":"ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8"} Jan 30 08:39:47 crc kubenswrapper[4870]: I0130 08:39:47.156085 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54dp6" event={"ID":"0419d51d-7b10-4e0f-b6ba-196fafeb8df2","Type":"ContainerStarted","Data":"80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8"} Jan 30 08:39:47 crc kubenswrapper[4870]: I0130 08:39:47.181663 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-54dp6" podStartSLOduration=2.669611535 podStartE2EDuration="5.181645517s" podCreationTimestamp="2026-01-30 08:39:42 +0000 UTC" firstStartedPulling="2026-01-30 08:39:44.125951545 +0000 UTC m=+1822.821498654" lastFinishedPulling="2026-01-30 08:39:46.637985527 +0000 UTC m=+1825.333532636" observedRunningTime="2026-01-30 08:39:47.178242331 +0000 UTC m=+1825.873789440" watchObservedRunningTime="2026-01-30 08:39:47.181645517 +0000 UTC m=+1825.877192626" Jan 30 08:39:52 crc kubenswrapper[4870]: I0130 08:39:52.541044 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 
08:39:52 crc kubenswrapper[4870]: I0130 08:39:52.541549 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:52 crc kubenswrapper[4870]: I0130 08:39:52.595788 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:53 crc kubenswrapper[4870]: I0130 08:39:53.249045 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:53 crc kubenswrapper[4870]: I0130 08:39:53.300774 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-54dp6"] Jan 30 08:39:55 crc kubenswrapper[4870]: I0130 08:39:55.224406 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-54dp6" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerName="registry-server" containerID="cri-o://80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8" gracePeriod=2 Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.202611 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.234375 4870 generic.go:334] "Generic (PLEG): container finished" podID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerID="80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8" exitCode=0 Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.234418 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54dp6" event={"ID":"0419d51d-7b10-4e0f-b6ba-196fafeb8df2","Type":"ContainerDied","Data":"80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8"} Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.234443 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-54dp6" event={"ID":"0419d51d-7b10-4e0f-b6ba-196fafeb8df2","Type":"ContainerDied","Data":"5d13e2421f8c4e836dce39d4509ef089cfed99d261e72152a996ec22cfbe9f95"} Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.234460 4870 scope.go:117] "RemoveContainer" containerID="80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.234582 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-54dp6" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.259995 4870 scope.go:117] "RemoveContainer" containerID="ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.283728 4870 scope.go:117] "RemoveContainer" containerID="a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.318464 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-catalog-content\") pod \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.318511 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-utilities\") pod \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.318639 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wdd5\" (UniqueName: \"kubernetes.io/projected/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-kube-api-access-4wdd5\") pod \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\" (UID: \"0419d51d-7b10-4e0f-b6ba-196fafeb8df2\") " Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.319765 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-utilities" (OuterVolumeSpecName: "utilities") pod "0419d51d-7b10-4e0f-b6ba-196fafeb8df2" (UID: "0419d51d-7b10-4e0f-b6ba-196fafeb8df2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.327743 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-kube-api-access-4wdd5" (OuterVolumeSpecName: "kube-api-access-4wdd5") pod "0419d51d-7b10-4e0f-b6ba-196fafeb8df2" (UID: "0419d51d-7b10-4e0f-b6ba-196fafeb8df2"). InnerVolumeSpecName "kube-api-access-4wdd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.346081 4870 scope.go:117] "RemoveContainer" containerID="80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8" Jan 30 08:39:56 crc kubenswrapper[4870]: E0130 08:39:56.346701 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8\": container with ID starting with 80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8 not found: ID does not exist" containerID="80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.346751 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0419d51d-7b10-4e0f-b6ba-196fafeb8df2" (UID: "0419d51d-7b10-4e0f-b6ba-196fafeb8df2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.346759 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8"} err="failed to get container status \"80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8\": rpc error: code = NotFound desc = could not find container \"80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8\": container with ID starting with 80b1b36e8da8d957d286029da9530e105f323a31e33617a692d7d80c253a4da8 not found: ID does not exist" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.346795 4870 scope.go:117] "RemoveContainer" containerID="ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8" Jan 30 08:39:56 crc kubenswrapper[4870]: E0130 08:39:56.347237 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8\": container with ID starting with ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8 not found: ID does not exist" containerID="ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.347278 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8"} err="failed to get container status \"ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8\": rpc error: code = NotFound desc = could not find container \"ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8\": container with ID starting with ae4aae4d2815ded1520810f050e6a1cae92f3b27778430a63762bed46deb47e8 not found: ID does not exist" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.347292 4870 scope.go:117] "RemoveContainer" 
containerID="a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d" Jan 30 08:39:56 crc kubenswrapper[4870]: E0130 08:39:56.347565 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d\": container with ID starting with a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d not found: ID does not exist" containerID="a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.347587 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d"} err="failed to get container status \"a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d\": rpc error: code = NotFound desc = could not find container \"a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d\": container with ID starting with a125d3f7cbe7f9ef0dc61cf3a2660410f3c17fb97c49bd74531df00e9d460f3d not found: ID does not exist" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.421701 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wdd5\" (UniqueName: \"kubernetes.io/projected/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-kube-api-access-4wdd5\") on node \"crc\" DevicePath \"\"" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.421741 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.421755 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0419d51d-7b10-4e0f-b6ba-196fafeb8df2-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 
08:39:56.570063 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-54dp6"] Jan 30 08:39:56 crc kubenswrapper[4870]: I0130 08:39:56.578532 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-54dp6"] Jan 30 08:39:58 crc kubenswrapper[4870]: I0130 08:39:58.090269 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" path="/var/lib/kubelet/pods/0419d51d-7b10-4e0f-b6ba-196fafeb8df2/volumes" Jan 30 08:40:00 crc kubenswrapper[4870]: I0130 08:40:00.075601 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:40:00 crc kubenswrapper[4870]: E0130 08:40:00.076435 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:40:01 crc kubenswrapper[4870]: I0130 08:40:01.074765 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mz9qm"] Jan 30 08:40:01 crc kubenswrapper[4870]: I0130 08:40:01.088988 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rxztf"] Jan 30 08:40:01 crc kubenswrapper[4870]: I0130 08:40:01.096969 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-p626s"] Jan 30 08:40:01 crc kubenswrapper[4870]: I0130 08:40:01.105131 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-p626s"] Jan 30 08:40:01 crc kubenswrapper[4870]: I0130 08:40:01.113212 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-db-create-mz9qm"] Jan 30 08:40:01 crc kubenswrapper[4870]: I0130 08:40:01.121434 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rxztf"] Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.039094 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-89bf-account-create-update-s9p8t"] Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.053593 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7c52-account-create-update-bc4lx"] Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.065268 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-89bf-account-create-update-s9p8t"] Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.091659 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0467c513-d47e-4251-a042-74a1f0a3ba8e" path="/var/lib/kubelet/pods/0467c513-d47e-4251-a042-74a1f0a3ba8e/volumes" Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.092474 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd82862-2bef-4d86-be4e-38f670a252bd" path="/var/lib/kubelet/pods/6cd82862-2bef-4d86-be4e-38f670a252bd/volumes" Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.093246 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82e1e2b-e78e-4b8f-8303-2ea82b24bf28" path="/var/lib/kubelet/pods/b82e1e2b-e78e-4b8f-8303-2ea82b24bf28/volumes" Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.094194 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981" path="/var/lib/kubelet/pods/f41a5ec0-d6c7-47ab-b69f-c6c2a8bc4981/volumes" Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.095567 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7c52-account-create-update-bc4lx"] Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.095608 4870 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9204-account-create-update-pczk5"] Jan 30 08:40:02 crc kubenswrapper[4870]: I0130 08:40:02.095624 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9204-account-create-update-pczk5"] Jan 30 08:40:04 crc kubenswrapper[4870]: I0130 08:40:04.085077 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa80552-6dc1-43b4-ba32-8fca58595c32" path="/var/lib/kubelet/pods/9aa80552-6dc1-43b4-ba32-8fca58595c32/volumes" Jan 30 08:40:04 crc kubenswrapper[4870]: I0130 08:40:04.085860 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf298cb-af81-4272-aacd-2d1342eab106" path="/var/lib/kubelet/pods/adf298cb-af81-4272-aacd-2d1342eab106/volumes" Jan 30 08:40:15 crc kubenswrapper[4870]: I0130 08:40:15.075932 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:40:15 crc kubenswrapper[4870]: E0130 08:40:15.077307 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:40:29 crc kubenswrapper[4870]: I0130 08:40:29.075381 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:40:29 crc kubenswrapper[4870]: I0130 08:40:29.547969 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"c9fb81e78dfc8d967fae5bac7b245ddd0a04c8e07a775324570150884f7934d2"} Jan 30 08:40:34 crc 
kubenswrapper[4870]: I0130 08:40:34.039295 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gs8vz"] Jan 30 08:40:34 crc kubenswrapper[4870]: I0130 08:40:34.049144 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gs8vz"] Jan 30 08:40:34 crc kubenswrapper[4870]: I0130 08:40:34.115180 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463149ce-687b-479c-ab61-030371f69acb" path="/var/lib/kubelet/pods/463149ce-687b-479c-ab61-030371f69acb/volumes" Jan 30 08:40:37 crc kubenswrapper[4870]: I0130 08:40:37.035132 4870 scope.go:117] "RemoveContainer" containerID="b3747f1a7b0dcf93ef3e9971ceb218b892bb2531c608e6a07760d677c25d7633" Jan 30 08:40:37 crc kubenswrapper[4870]: I0130 08:40:37.111815 4870 scope.go:117] "RemoveContainer" containerID="23a36de41e3e5413c9d4a8e53e9d9062761ceb2d5ea6dc50cc6414dd812317b7" Jan 30 08:40:37 crc kubenswrapper[4870]: I0130 08:40:37.156990 4870 scope.go:117] "RemoveContainer" containerID="ebc3f13bad52a8c63665a782767852e60e31712851103659b19d6a855c623701" Jan 30 08:40:37 crc kubenswrapper[4870]: I0130 08:40:37.238166 4870 scope.go:117] "RemoveContainer" containerID="e7588860011aa90e39e44c8b147a927646ababcde64482fc80c549dc156bfaf7" Jan 30 08:40:37 crc kubenswrapper[4870]: I0130 08:40:37.264435 4870 scope.go:117] "RemoveContainer" containerID="10d8adf976aee141ddedf0f0b7d4a560074ff0040c0d225d7fde8dac560cebcd" Jan 30 08:40:37 crc kubenswrapper[4870]: I0130 08:40:37.312571 4870 scope.go:117] "RemoveContainer" containerID="ad4987bcd683a82b2ef435208c93f9f9d4904561809fe23aa9fd681008a558c6" Jan 30 08:40:37 crc kubenswrapper[4870]: I0130 08:40:37.352637 4870 scope.go:117] "RemoveContainer" containerID="046d4010b0f900fe2cbd28328fdfa8554886e3c18049908c92ba7d45ff824b80" Jan 30 08:40:53 crc kubenswrapper[4870]: I0130 08:40:53.051321 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrk8x"] Jan 30 
08:40:53 crc kubenswrapper[4870]: I0130 08:40:53.065235 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vrk8x"] Jan 30 08:40:54 crc kubenswrapper[4870]: I0130 08:40:54.085016 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5" path="/var/lib/kubelet/pods/8e3f03b1-ce9f-4f1d-8bb9-eecb941268c5/volumes" Jan 30 08:40:57 crc kubenswrapper[4870]: I0130 08:40:57.870379 4870 generic.go:334] "Generic (PLEG): container finished" podID="1eea19c9-87be-4160-8c11-c7ecd13cf088" containerID="7932ebc87656bf0ae01e190d06d498a15ca8251a6e3095a4f845c9e27a4ab873" exitCode=0 Jan 30 08:40:57 crc kubenswrapper[4870]: I0130 08:40:57.870473 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" event={"ID":"1eea19c9-87be-4160-8c11-c7ecd13cf088","Type":"ContainerDied","Data":"7932ebc87656bf0ae01e190d06d498a15ca8251a6e3095a4f845c9e27a4ab873"} Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.279605 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.353253 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-inventory\") pod \"1eea19c9-87be-4160-8c11-c7ecd13cf088\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.353477 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xplnt\" (UniqueName: \"kubernetes.io/projected/1eea19c9-87be-4160-8c11-c7ecd13cf088-kube-api-access-xplnt\") pod \"1eea19c9-87be-4160-8c11-c7ecd13cf088\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.353589 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-ssh-key-openstack-edpm-ipam\") pod \"1eea19c9-87be-4160-8c11-c7ecd13cf088\" (UID: \"1eea19c9-87be-4160-8c11-c7ecd13cf088\") " Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.359537 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eea19c9-87be-4160-8c11-c7ecd13cf088-kube-api-access-xplnt" (OuterVolumeSpecName: "kube-api-access-xplnt") pod "1eea19c9-87be-4160-8c11-c7ecd13cf088" (UID: "1eea19c9-87be-4160-8c11-c7ecd13cf088"). InnerVolumeSpecName "kube-api-access-xplnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.385154 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-inventory" (OuterVolumeSpecName: "inventory") pod "1eea19c9-87be-4160-8c11-c7ecd13cf088" (UID: "1eea19c9-87be-4160-8c11-c7ecd13cf088"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.385904 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1eea19c9-87be-4160-8c11-c7ecd13cf088" (UID: "1eea19c9-87be-4160-8c11-c7ecd13cf088"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.456301 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xplnt\" (UniqueName: \"kubernetes.io/projected/1eea19c9-87be-4160-8c11-c7ecd13cf088-kube-api-access-xplnt\") on node \"crc\" DevicePath \"\"" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.456349 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.456363 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1eea19c9-87be-4160-8c11-c7ecd13cf088-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.891035 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" event={"ID":"1eea19c9-87be-4160-8c11-c7ecd13cf088","Type":"ContainerDied","Data":"1770fb330557b332b33d9381d3b29b9bcd4d66980b06a1acc3c5507ce1e08776"} Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.891072 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1770fb330557b332b33d9381d3b29b9bcd4d66980b06a1acc3c5507ce1e08776" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 
08:40:59.891084 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-948rh" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.978348 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh"] Jan 30 08:40:59 crc kubenswrapper[4870]: E0130 08:40:59.978703 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerName="extract-content" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.978718 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerName="extract-content" Jan 30 08:40:59 crc kubenswrapper[4870]: E0130 08:40:59.978736 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eea19c9-87be-4160-8c11-c7ecd13cf088" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.978744 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eea19c9-87be-4160-8c11-c7ecd13cf088" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 08:40:59 crc kubenswrapper[4870]: E0130 08:40:59.978756 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerName="registry-server" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.978762 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerName="registry-server" Jan 30 08:40:59 crc kubenswrapper[4870]: E0130 08:40:59.978778 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerName="extract-utilities" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.978783 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" 
containerName="extract-utilities" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.979026 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eea19c9-87be-4160-8c11-c7ecd13cf088" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.979053 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0419d51d-7b10-4e0f-b6ba-196fafeb8df2" containerName="registry-server" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.979656 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.982793 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.982910 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.983022 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.983200 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:40:59 crc kubenswrapper[4870]: I0130 08:40:59.991044 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh"] Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.068247 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: 
\"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.068381 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjj6\" (UniqueName: \"kubernetes.io/projected/2f708fca-b1a9-432a-acbe-df74341208d2-kube-api-access-nfjj6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.068555 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.170082 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.170218 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjj6\" (UniqueName: \"kubernetes.io/projected/2f708fca-b1a9-432a-acbe-df74341208d2-kube-api-access-nfjj6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 
08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.170293 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.174419 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.174485 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.187115 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjj6\" (UniqueName: \"kubernetes.io/projected/2f708fca-b1a9-432a-acbe-df74341208d2-kube-api-access-nfjj6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.294231 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.876095 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh"] Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.882016 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 08:41:00 crc kubenswrapper[4870]: I0130 08:41:00.919190 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" event={"ID":"2f708fca-b1a9-432a-acbe-df74341208d2","Type":"ContainerStarted","Data":"025650171f5d69cc5598b7176854c53e1bb04509791618e6a95971b09eded094"} Jan 30 08:41:02 crc kubenswrapper[4870]: I0130 08:41:02.038274 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m882v"] Jan 30 08:41:02 crc kubenswrapper[4870]: I0130 08:41:02.044447 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m882v"] Jan 30 08:41:02 crc kubenswrapper[4870]: I0130 08:41:02.087958 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7d1e35-e72c-4a05-8a4a-89647f93a26c" path="/var/lib/kubelet/pods/df7d1e35-e72c-4a05-8a4a-89647f93a26c/volumes" Jan 30 08:41:04 crc kubenswrapper[4870]: I0130 08:41:04.958339 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" event={"ID":"2f708fca-b1a9-432a-acbe-df74341208d2","Type":"ContainerStarted","Data":"4c8c236a6e47501efdc59b70f477bf862ee67ea67d633a312fce41dcf1117d23"} Jan 30 08:41:06 crc kubenswrapper[4870]: I0130 08:41:06.004833 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" podStartSLOduration=4.20319988 
podStartE2EDuration="7.004806531s" podCreationTimestamp="2026-01-30 08:40:59 +0000 UTC" firstStartedPulling="2026-01-30 08:41:00.88178233 +0000 UTC m=+1899.577329439" lastFinishedPulling="2026-01-30 08:41:03.683388981 +0000 UTC m=+1902.378936090" observedRunningTime="2026-01-30 08:41:05.989534455 +0000 UTC m=+1904.685081604" watchObservedRunningTime="2026-01-30 08:41:06.004806531 +0000 UTC m=+1904.700353680" Jan 30 08:41:10 crc kubenswrapper[4870]: I0130 08:41:10.012146 4870 generic.go:334] "Generic (PLEG): container finished" podID="2f708fca-b1a9-432a-acbe-df74341208d2" containerID="4c8c236a6e47501efdc59b70f477bf862ee67ea67d633a312fce41dcf1117d23" exitCode=0 Jan 30 08:41:10 crc kubenswrapper[4870]: I0130 08:41:10.012248 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" event={"ID":"2f708fca-b1a9-432a-acbe-df74341208d2","Type":"ContainerDied","Data":"4c8c236a6e47501efdc59b70f477bf862ee67ea67d633a312fce41dcf1117d23"} Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.438438 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.526183 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-inventory\") pod \"2f708fca-b1a9-432a-acbe-df74341208d2\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.526269 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-ssh-key-openstack-edpm-ipam\") pod \"2f708fca-b1a9-432a-acbe-df74341208d2\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.526393 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfjj6\" (UniqueName: \"kubernetes.io/projected/2f708fca-b1a9-432a-acbe-df74341208d2-kube-api-access-nfjj6\") pod \"2f708fca-b1a9-432a-acbe-df74341208d2\" (UID: \"2f708fca-b1a9-432a-acbe-df74341208d2\") " Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.533175 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f708fca-b1a9-432a-acbe-df74341208d2-kube-api-access-nfjj6" (OuterVolumeSpecName: "kube-api-access-nfjj6") pod "2f708fca-b1a9-432a-acbe-df74341208d2" (UID: "2f708fca-b1a9-432a-acbe-df74341208d2"). InnerVolumeSpecName "kube-api-access-nfjj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.555063 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2f708fca-b1a9-432a-acbe-df74341208d2" (UID: "2f708fca-b1a9-432a-acbe-df74341208d2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.555099 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-inventory" (OuterVolumeSpecName: "inventory") pod "2f708fca-b1a9-432a-acbe-df74341208d2" (UID: "2f708fca-b1a9-432a-acbe-df74341208d2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.629029 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfjj6\" (UniqueName: \"kubernetes.io/projected/2f708fca-b1a9-432a-acbe-df74341208d2-kube-api-access-nfjj6\") on node \"crc\" DevicePath \"\"" Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.629062 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:41:11 crc kubenswrapper[4870]: I0130 08:41:11.629072 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2f708fca-b1a9-432a-acbe-df74341208d2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.030740 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" 
event={"ID":"2f708fca-b1a9-432a-acbe-df74341208d2","Type":"ContainerDied","Data":"025650171f5d69cc5598b7176854c53e1bb04509791618e6a95971b09eded094"} Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.030781 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="025650171f5d69cc5598b7176854c53e1bb04509791618e6a95971b09eded094" Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.030797 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh" Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.125995 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"] Jan 30 08:41:12 crc kubenswrapper[4870]: E0130 08:41:12.126570 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f708fca-b1a9-432a-acbe-df74341208d2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.126593 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f708fca-b1a9-432a-acbe-df74341208d2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.126829 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f708fca-b1a9-432a-acbe-df74341208d2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.127674 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.129668 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.129973 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.130260 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.130377 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.138814 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"]
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.139927 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.139981 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.140009 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlph\" (UniqueName: \"kubernetes.io/projected/82fb960a-335c-4d35-baed-122cd1cb515d-kube-api-access-7rlph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.241053 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.241362 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.241425 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlph\" (UniqueName: \"kubernetes.io/projected/82fb960a-335c-4d35-baed-122cd1cb515d-kube-api-access-7rlph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.245257 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.250312 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.258314 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlph\" (UniqueName: \"kubernetes.io/projected/82fb960a-335c-4d35-baed-122cd1cb515d-kube-api-access-7rlph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z4hkm\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.444958 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:12 crc kubenswrapper[4870]: I0130 08:41:12.984533 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"]
Jan 30 08:41:13 crc kubenswrapper[4870]: I0130 08:41:13.044661 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm" event={"ID":"82fb960a-335c-4d35-baed-122cd1cb515d","Type":"ContainerStarted","Data":"ac21d1032be1f0066a16137837559fa556ce16b8cc044603da108adf8f506076"}
Jan 30 08:41:15 crc kubenswrapper[4870]: I0130 08:41:15.059613 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm" event={"ID":"82fb960a-335c-4d35-baed-122cd1cb515d","Type":"ContainerStarted","Data":"96fa2735a8adb0378e99eb607ea919b72d94c98634fe9c945d63764c78f1d5a5"}
Jan 30 08:41:15 crc kubenswrapper[4870]: I0130 08:41:15.077829 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm" podStartSLOduration=2.364217616 podStartE2EDuration="3.07781136s" podCreationTimestamp="2026-01-30 08:41:12 +0000 UTC" firstStartedPulling="2026-01-30 08:41:12.997840155 +0000 UTC m=+1911.693387264" lastFinishedPulling="2026-01-30 08:41:13.711433899 +0000 UTC m=+1912.406981008" observedRunningTime="2026-01-30 08:41:15.072169234 +0000 UTC m=+1913.767716343" watchObservedRunningTime="2026-01-30 08:41:15.07781136 +0000 UTC m=+1913.773358469"
Jan 30 08:41:37 crc kubenswrapper[4870]: I0130 08:41:37.558986 4870 scope.go:117] "RemoveContainer" containerID="d505e04dc454937c02de4ea80fb1b30e9ec281deb651bdc207ab606295f95619"
Jan 30 08:41:37 crc kubenswrapper[4870]: I0130 08:41:37.629901 4870 scope.go:117] "RemoveContainer" containerID="8042d48e5c92127670e964e37628c795eb4a833864e07f5be3a23644c40ab2aa"
Jan 30 08:41:43 crc kubenswrapper[4870]: I0130 08:41:43.052543 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hhwc4"]
Jan 30 08:41:43 crc kubenswrapper[4870]: I0130 08:41:43.064772 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hhwc4"]
Jan 30 08:41:44 crc kubenswrapper[4870]: I0130 08:41:44.089956 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1dfb454-58dc-4c83-b25e-cabaab6cb747" path="/var/lib/kubelet/pods/c1dfb454-58dc-4c83-b25e-cabaab6cb747/volumes"
Jan 30 08:41:53 crc kubenswrapper[4870]: I0130 08:41:53.792470 4870 generic.go:334] "Generic (PLEG): container finished" podID="82fb960a-335c-4d35-baed-122cd1cb515d" containerID="96fa2735a8adb0378e99eb607ea919b72d94c98634fe9c945d63764c78f1d5a5" exitCode=0
Jan 30 08:41:53 crc kubenswrapper[4870]: I0130 08:41:53.793029 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm" event={"ID":"82fb960a-335c-4d35-baed-122cd1cb515d","Type":"ContainerDied","Data":"96fa2735a8adb0378e99eb607ea919b72d94c98634fe9c945d63764c78f1d5a5"}
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.262333 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.406833 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-ssh-key-openstack-edpm-ipam\") pod \"82fb960a-335c-4d35-baed-122cd1cb515d\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") "
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.406947 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-inventory\") pod \"82fb960a-335c-4d35-baed-122cd1cb515d\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") "
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.406985 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rlph\" (UniqueName: \"kubernetes.io/projected/82fb960a-335c-4d35-baed-122cd1cb515d-kube-api-access-7rlph\") pod \"82fb960a-335c-4d35-baed-122cd1cb515d\" (UID: \"82fb960a-335c-4d35-baed-122cd1cb515d\") "
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.434126 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82fb960a-335c-4d35-baed-122cd1cb515d-kube-api-access-7rlph" (OuterVolumeSpecName: "kube-api-access-7rlph") pod "82fb960a-335c-4d35-baed-122cd1cb515d" (UID: "82fb960a-335c-4d35-baed-122cd1cb515d"). InnerVolumeSpecName "kube-api-access-7rlph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.460076 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-inventory" (OuterVolumeSpecName: "inventory") pod "82fb960a-335c-4d35-baed-122cd1cb515d" (UID: "82fb960a-335c-4d35-baed-122cd1cb515d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.471444 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "82fb960a-335c-4d35-baed-122cd1cb515d" (UID: "82fb960a-335c-4d35-baed-122cd1cb515d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.509664 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.509728 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82fb960a-335c-4d35-baed-122cd1cb515d-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.509742 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rlph\" (UniqueName: \"kubernetes.io/projected/82fb960a-335c-4d35-baed-122cd1cb515d-kube-api-access-7rlph\") on node \"crc\" DevicePath \"\""
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.822446 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm" event={"ID":"82fb960a-335c-4d35-baed-122cd1cb515d","Type":"ContainerDied","Data":"ac21d1032be1f0066a16137837559fa556ce16b8cc044603da108adf8f506076"}
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.822498 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac21d1032be1f0066a16137837559fa556ce16b8cc044603da108adf8f506076"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.822560 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z4hkm"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.921419 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"]
Jan 30 08:41:55 crc kubenswrapper[4870]: E0130 08:41:55.921926 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82fb960a-335c-4d35-baed-122cd1cb515d" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.921948 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="82fb960a-335c-4d35-baed-122cd1cb515d" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.922259 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="82fb960a-335c-4d35-baed-122cd1cb515d" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.922993 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.925362 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.926000 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.929776 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.929943 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:41:55 crc kubenswrapper[4870]: I0130 08:41:55.938869 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"]
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.121617 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.122118 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srxkm\" (UniqueName: \"kubernetes.io/projected/f32f4b01-631a-4f4b-8ffb-f0873b819de0-kube-api-access-srxkm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.122414 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.224454 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.224594 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srxkm\" (UniqueName: \"kubernetes.io/projected/f32f4b01-631a-4f4b-8ffb-f0873b819de0-kube-api-access-srxkm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.224662 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.229515 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.237097 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.243494 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srxkm\" (UniqueName: \"kubernetes.io/projected/f32f4b01-631a-4f4b-8ffb-f0873b819de0-kube-api-access-srxkm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.281599 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:41:56 crc kubenswrapper[4870]: I0130 08:41:56.891282 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"]
Jan 30 08:41:57 crc kubenswrapper[4870]: I0130 08:41:57.840279 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n" event={"ID":"f32f4b01-631a-4f4b-8ffb-f0873b819de0","Type":"ContainerStarted","Data":"842d2706f5ff57fb0c4252a173f50ad6ed9ad319ed05f4307b1cbd3bb33828b1"}
Jan 30 08:41:57 crc kubenswrapper[4870]: I0130 08:41:57.840631 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n" event={"ID":"f32f4b01-631a-4f4b-8ffb-f0873b819de0","Type":"ContainerStarted","Data":"8a351807bd85fefefa0570c8fe5bdbb23839b0f7ee2e93949b4dd32f9ca2828e"}
Jan 30 08:41:57 crc kubenswrapper[4870]: I0130 08:41:57.866166 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n" podStartSLOduration=2.432303346 podStartE2EDuration="2.86613841s" podCreationTimestamp="2026-01-30 08:41:55 +0000 UTC" firstStartedPulling="2026-01-30 08:41:56.900771408 +0000 UTC m=+1955.596318517" lastFinishedPulling="2026-01-30 08:41:57.334606472 +0000 UTC m=+1956.030153581" observedRunningTime="2026-01-30 08:41:57.854860918 +0000 UTC m=+1956.550408037" watchObservedRunningTime="2026-01-30 08:41:57.86613841 +0000 UTC m=+1956.561685519"
Jan 30 08:42:37 crc kubenswrapper[4870]: I0130 08:42:37.713161 4870 scope.go:117] "RemoveContainer" containerID="4b78d78b78d21abcc7506de0b24454a50e055736a3c90f711e671ea39c5653ae"
Jan 30 08:42:47 crc kubenswrapper[4870]: I0130 08:42:47.286237 4870 generic.go:334] "Generic (PLEG): container finished" podID="f32f4b01-631a-4f4b-8ffb-f0873b819de0" containerID="842d2706f5ff57fb0c4252a173f50ad6ed9ad319ed05f4307b1cbd3bb33828b1" exitCode=0
Jan 30 08:42:47 crc kubenswrapper[4870]: I0130 08:42:47.286335 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n" event={"ID":"f32f4b01-631a-4f4b-8ffb-f0873b819de0","Type":"ContainerDied","Data":"842d2706f5ff57fb0c4252a173f50ad6ed9ad319ed05f4307b1cbd3bb33828b1"}
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.742171 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.789550 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srxkm\" (UniqueName: \"kubernetes.io/projected/f32f4b01-631a-4f4b-8ffb-f0873b819de0-kube-api-access-srxkm\") pod \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") "
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.790239 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-inventory\") pod \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") "
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.790512 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-ssh-key-openstack-edpm-ipam\") pod \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\" (UID: \"f32f4b01-631a-4f4b-8ffb-f0873b819de0\") "
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.797285 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32f4b01-631a-4f4b-8ffb-f0873b819de0-kube-api-access-srxkm" (OuterVolumeSpecName: "kube-api-access-srxkm") pod "f32f4b01-631a-4f4b-8ffb-f0873b819de0" (UID: "f32f4b01-631a-4f4b-8ffb-f0873b819de0"). InnerVolumeSpecName "kube-api-access-srxkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.820414 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-inventory" (OuterVolumeSpecName: "inventory") pod "f32f4b01-631a-4f4b-8ffb-f0873b819de0" (UID: "f32f4b01-631a-4f4b-8ffb-f0873b819de0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.825439 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f32f4b01-631a-4f4b-8ffb-f0873b819de0" (UID: "f32f4b01-631a-4f4b-8ffb-f0873b819de0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.895659 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srxkm\" (UniqueName: \"kubernetes.io/projected/f32f4b01-631a-4f4b-8ffb-f0873b819de0-kube-api-access-srxkm\") on node \"crc\" DevicePath \"\""
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.895719 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 08:42:48 crc kubenswrapper[4870]: I0130 08:42:48.895735 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f32f4b01-631a-4f4b-8ffb-f0873b819de0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.317927 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n" event={"ID":"f32f4b01-631a-4f4b-8ffb-f0873b819de0","Type":"ContainerDied","Data":"8a351807bd85fefefa0570c8fe5bdbb23839b0f7ee2e93949b4dd32f9ca2828e"}
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.317968 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a351807bd85fefefa0570c8fe5bdbb23839b0f7ee2e93949b4dd32f9ca2828e"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.318005 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.399337 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j8w7g"]
Jan 30 08:42:49 crc kubenswrapper[4870]: E0130 08:42:49.399892 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32f4b01-631a-4f4b-8ffb-f0873b819de0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.399913 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32f4b01-631a-4f4b-8ffb-f0873b819de0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.400104 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32f4b01-631a-4f4b-8ffb-f0873b819de0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.400819 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.403409 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.403609 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.403745 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.403866 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.408433 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j8w7g"]
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.506404 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.506523 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd9zp\" (UniqueName: \"kubernetes.io/projected/07db545c-df21-4f19-ad37-3071248b8672-kube-api-access-kd9zp\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.507560 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.609060 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.609194 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.609291 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd9zp\" (UniqueName: \"kubernetes.io/projected/07db545c-df21-4f19-ad37-3071248b8672-kube-api-access-kd9zp\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.612656 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.614284 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.632856 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd9zp\" (UniqueName: \"kubernetes.io/projected/07db545c-df21-4f19-ad37-3071248b8672-kube-api-access-kd9zp\") pod \"ssh-known-hosts-edpm-deployment-j8w7g\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") " pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:49 crc kubenswrapper[4870]: I0130 08:42:49.754158 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:50 crc kubenswrapper[4870]: I0130 08:42:50.425852 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j8w7g"]
Jan 30 08:42:51 crc kubenswrapper[4870]: I0130 08:42:51.338266 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g" event={"ID":"07db545c-df21-4f19-ad37-3071248b8672","Type":"ContainerStarted","Data":"99572818f1bad16bed8a0aa758dc16a90117312700653b7060fa549cf2294c0d"}
Jan 30 08:42:51 crc kubenswrapper[4870]: I0130 08:42:51.338672 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g" event={"ID":"07db545c-df21-4f19-ad37-3071248b8672","Type":"ContainerStarted","Data":"d0704d1a7324fb19affb05f629070ba3f4a83927946c3dbdffc8fda4be173d38"}
Jan 30 08:42:55 crc kubenswrapper[4870]: I0130 08:42:55.249800 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:42:55 crc kubenswrapper[4870]: I0130 08:42:55.250117 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 08:42:58 crc kubenswrapper[4870]: I0130 08:42:58.395097 4870 generic.go:334] "Generic (PLEG): container finished" podID="07db545c-df21-4f19-ad37-3071248b8672" containerID="99572818f1bad16bed8a0aa758dc16a90117312700653b7060fa549cf2294c0d" exitCode=0
Jan 30 08:42:58 crc kubenswrapper[4870]: I0130 08:42:58.395148 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g" event={"ID":"07db545c-df21-4f19-ad37-3071248b8672","Type":"ContainerDied","Data":"99572818f1bad16bed8a0aa758dc16a90117312700653b7060fa549cf2294c0d"}
Jan 30 08:42:59 crc kubenswrapper[4870]: I0130 08:42:59.894993 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:42:59 crc kubenswrapper[4870]: I0130 08:42:59.920578 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd9zp\" (UniqueName: \"kubernetes.io/projected/07db545c-df21-4f19-ad37-3071248b8672-kube-api-access-kd9zp\") pod \"07db545c-df21-4f19-ad37-3071248b8672\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") "
Jan 30 08:42:59 crc kubenswrapper[4870]: I0130 08:42:59.920645 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-inventory-0\") pod \"07db545c-df21-4f19-ad37-3071248b8672\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") "
Jan 30 08:42:59 crc kubenswrapper[4870]: I0130 08:42:59.920710 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-ssh-key-openstack-edpm-ipam\") pod \"07db545c-df21-4f19-ad37-3071248b8672\" (UID: \"07db545c-df21-4f19-ad37-3071248b8672\") "
Jan 30 08:42:59 crc kubenswrapper[4870]: I0130 08:42:59.926292 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07db545c-df21-4f19-ad37-3071248b8672-kube-api-access-kd9zp" (OuterVolumeSpecName: "kube-api-access-kd9zp") pod "07db545c-df21-4f19-ad37-3071248b8672" (UID: "07db545c-df21-4f19-ad37-3071248b8672"). InnerVolumeSpecName "kube-api-access-kd9zp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:42:59 crc kubenswrapper[4870]: I0130 08:42:59.956073 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "07db545c-df21-4f19-ad37-3071248b8672" (UID: "07db545c-df21-4f19-ad37-3071248b8672"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:42:59 crc kubenswrapper[4870]: I0130 08:42:59.959714 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "07db545c-df21-4f19-ad37-3071248b8672" (UID: "07db545c-df21-4f19-ad37-3071248b8672"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.023007 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd9zp\" (UniqueName: \"kubernetes.io/projected/07db545c-df21-4f19-ad37-3071248b8672-kube-api-access-kd9zp\") on node \"crc\" DevicePath \"\""
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.023050 4870 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-inventory-0\") on node \"crc\" DevicePath \"\""
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.023063 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07db545c-df21-4f19-ad37-3071248b8672-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.415548 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g" event={"ID":"07db545c-df21-4f19-ad37-3071248b8672","Type":"ContainerDied","Data":"d0704d1a7324fb19affb05f629070ba3f4a83927946c3dbdffc8fda4be173d38"}
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.415936 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0704d1a7324fb19affb05f629070ba3f4a83927946c3dbdffc8fda4be173d38"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.415592 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j8w7g"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.488662 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw"]
Jan 30 08:43:00 crc kubenswrapper[4870]: E0130 08:43:00.489065 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07db545c-df21-4f19-ad37-3071248b8672" containerName="ssh-known-hosts-edpm-deployment"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.489084 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="07db545c-df21-4f19-ad37-3071248b8672" containerName="ssh-known-hosts-edpm-deployment"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.489290 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="07db545c-df21-4f19-ad37-3071248b8672" containerName="ssh-known-hosts-edpm-deployment"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.490055 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.492121 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.492747 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.492795 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.493251 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.504751 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw"]
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.534595 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.534656 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdkrm\" (UniqueName: \"kubernetes.io/projected/a685318c-e23f-4192-8ab4-7dbf24880b0d-kube-api-access-gdkrm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw"
Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.534725 4870
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.636773 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.636817 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdkrm\" (UniqueName: \"kubernetes.io/projected/a685318c-e23f-4192-8ab4-7dbf24880b0d-kube-api-access-gdkrm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.636907 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.641985 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.642155 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.655609 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdkrm\" (UniqueName: \"kubernetes.io/projected/a685318c-e23f-4192-8ab4-7dbf24880b0d-kube-api-access-gdkrm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zl6bw\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:00 crc kubenswrapper[4870]: I0130 08:43:00.815010 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:01 crc kubenswrapper[4870]: I0130 08:43:01.412669 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw"] Jan 30 08:43:01 crc kubenswrapper[4870]: I0130 08:43:01.428910 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" event={"ID":"a685318c-e23f-4192-8ab4-7dbf24880b0d","Type":"ContainerStarted","Data":"98e9af6a5a9f939da73b19d3e83f7092c62a5c0fbe279d5298471495f178146d"} Jan 30 08:43:02 crc kubenswrapper[4870]: I0130 08:43:02.442030 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" event={"ID":"a685318c-e23f-4192-8ab4-7dbf24880b0d","Type":"ContainerStarted","Data":"41f27eabe1099445b59edd44241e4073e4d7dcaf32de5af8d9511c8bb8c53398"} Jan 30 08:43:02 crc kubenswrapper[4870]: I0130 08:43:02.470996 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" podStartSLOduration=1.9879866769999999 podStartE2EDuration="2.470966262s" podCreationTimestamp="2026-01-30 08:43:00 +0000 UTC" firstStartedPulling="2026-01-30 08:43:01.415606497 +0000 UTC m=+2020.111153606" lastFinishedPulling="2026-01-30 08:43:01.898586082 +0000 UTC m=+2020.594133191" observedRunningTime="2026-01-30 08:43:02.462384485 +0000 UTC m=+2021.157931584" watchObservedRunningTime="2026-01-30 08:43:02.470966262 +0000 UTC m=+2021.166513371" Jan 30 08:43:10 crc kubenswrapper[4870]: I0130 08:43:10.529669 4870 generic.go:334] "Generic (PLEG): container finished" podID="a685318c-e23f-4192-8ab4-7dbf24880b0d" containerID="41f27eabe1099445b59edd44241e4073e4d7dcaf32de5af8d9511c8bb8c53398" exitCode=0 Jan 30 08:43:10 crc kubenswrapper[4870]: I0130 08:43:10.529805 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" event={"ID":"a685318c-e23f-4192-8ab4-7dbf24880b0d","Type":"ContainerDied","Data":"41f27eabe1099445b59edd44241e4073e4d7dcaf32de5af8d9511c8bb8c53398"} Jan 30 08:43:11 crc kubenswrapper[4870]: I0130 08:43:11.954240 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.137858 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-ssh-key-openstack-edpm-ipam\") pod \"a685318c-e23f-4192-8ab4-7dbf24880b0d\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.138072 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-inventory\") pod \"a685318c-e23f-4192-8ab4-7dbf24880b0d\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.138119 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdkrm\" (UniqueName: \"kubernetes.io/projected/a685318c-e23f-4192-8ab4-7dbf24880b0d-kube-api-access-gdkrm\") pod \"a685318c-e23f-4192-8ab4-7dbf24880b0d\" (UID: \"a685318c-e23f-4192-8ab4-7dbf24880b0d\") " Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.144323 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a685318c-e23f-4192-8ab4-7dbf24880b0d-kube-api-access-gdkrm" (OuterVolumeSpecName: "kube-api-access-gdkrm") pod "a685318c-e23f-4192-8ab4-7dbf24880b0d" (UID: "a685318c-e23f-4192-8ab4-7dbf24880b0d"). InnerVolumeSpecName "kube-api-access-gdkrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.173923 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a685318c-e23f-4192-8ab4-7dbf24880b0d" (UID: "a685318c-e23f-4192-8ab4-7dbf24880b0d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.180654 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-inventory" (OuterVolumeSpecName: "inventory") pod "a685318c-e23f-4192-8ab4-7dbf24880b0d" (UID: "a685318c-e23f-4192-8ab4-7dbf24880b0d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.241058 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.241096 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdkrm\" (UniqueName: \"kubernetes.io/projected/a685318c-e23f-4192-8ab4-7dbf24880b0d-kube-api-access-gdkrm\") on node \"crc\" DevicePath \"\"" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.241110 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a685318c-e23f-4192-8ab4-7dbf24880b0d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.550160 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" 
event={"ID":"a685318c-e23f-4192-8ab4-7dbf24880b0d","Type":"ContainerDied","Data":"98e9af6a5a9f939da73b19d3e83f7092c62a5c0fbe279d5298471495f178146d"} Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.550240 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98e9af6a5a9f939da73b19d3e83f7092c62a5c0fbe279d5298471495f178146d" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.550237 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zl6bw" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.622352 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29"] Jan 30 08:43:12 crc kubenswrapper[4870]: E0130 08:43:12.622841 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a685318c-e23f-4192-8ab4-7dbf24880b0d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.622868 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a685318c-e23f-4192-8ab4-7dbf24880b0d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.623144 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="a685318c-e23f-4192-8ab4-7dbf24880b0d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.624053 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.625810 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.626658 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.626822 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.627007 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.639234 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29"] Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.649401 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qr9b\" (UniqueName: \"kubernetes.io/projected/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-kube-api-access-5qr9b\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.649488 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 
08:43:12.649541 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.750647 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qr9b\" (UniqueName: \"kubernetes.io/projected/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-kube-api-access-5qr9b\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.750719 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.750781 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.755600 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.760422 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.776440 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qr9b\" (UniqueName: \"kubernetes.io/projected/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-kube-api-access-5qr9b\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:12 crc kubenswrapper[4870]: I0130 08:43:12.941108 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:13 crc kubenswrapper[4870]: I0130 08:43:13.508821 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29"] Jan 30 08:43:13 crc kubenswrapper[4870]: I0130 08:43:13.559722 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" event={"ID":"7c9e0c7d-dc65-4862-99da-326bc8d45bfd","Type":"ContainerStarted","Data":"056c785611d96cce791fea952d02746ae7d8237c5a27e5a107a3d1799ffee03a"} Jan 30 08:43:14 crc kubenswrapper[4870]: I0130 08:43:14.570717 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" event={"ID":"7c9e0c7d-dc65-4862-99da-326bc8d45bfd","Type":"ContainerStarted","Data":"fb6da9f1b6e1c5ea9c983d97d60198441c391892546fcff50f89801de2f5a6bd"} Jan 30 08:43:14 crc kubenswrapper[4870]: I0130 08:43:14.592339 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" podStartSLOduration=2.12668546 podStartE2EDuration="2.592311815s" podCreationTimestamp="2026-01-30 08:43:12 +0000 UTC" firstStartedPulling="2026-01-30 08:43:13.503441966 +0000 UTC m=+2032.198989075" lastFinishedPulling="2026-01-30 08:43:13.969068321 +0000 UTC m=+2032.664615430" observedRunningTime="2026-01-30 08:43:14.584808812 +0000 UTC m=+2033.280355951" watchObservedRunningTime="2026-01-30 08:43:14.592311815 +0000 UTC m=+2033.287858924" Jan 30 08:43:23 crc kubenswrapper[4870]: I0130 08:43:23.646683 4870 generic.go:334] "Generic (PLEG): container finished" podID="7c9e0c7d-dc65-4862-99da-326bc8d45bfd" containerID="fb6da9f1b6e1c5ea9c983d97d60198441c391892546fcff50f89801de2f5a6bd" exitCode=0 Jan 30 08:43:23 crc kubenswrapper[4870]: I0130 08:43:23.646967 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" event={"ID":"7c9e0c7d-dc65-4862-99da-326bc8d45bfd","Type":"ContainerDied","Data":"fb6da9f1b6e1c5ea9c983d97d60198441c391892546fcff50f89801de2f5a6bd"} Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.067619 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.199547 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-ssh-key-openstack-edpm-ipam\") pod \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.200112 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-inventory\") pod \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.200158 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qr9b\" (UniqueName: \"kubernetes.io/projected/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-kube-api-access-5qr9b\") pod \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\" (UID: \"7c9e0c7d-dc65-4862-99da-326bc8d45bfd\") " Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.204402 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-kube-api-access-5qr9b" (OuterVolumeSpecName: "kube-api-access-5qr9b") pod "7c9e0c7d-dc65-4862-99da-326bc8d45bfd" (UID: "7c9e0c7d-dc65-4862-99da-326bc8d45bfd"). InnerVolumeSpecName "kube-api-access-5qr9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.226269 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7c9e0c7d-dc65-4862-99da-326bc8d45bfd" (UID: "7c9e0c7d-dc65-4862-99da-326bc8d45bfd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.228541 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-inventory" (OuterVolumeSpecName: "inventory") pod "7c9e0c7d-dc65-4862-99da-326bc8d45bfd" (UID: "7c9e0c7d-dc65-4862-99da-326bc8d45bfd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.249995 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.250051 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.302050 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 
08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.302090 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.302102 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qr9b\" (UniqueName: \"kubernetes.io/projected/7c9e0c7d-dc65-4862-99da-326bc8d45bfd-kube-api-access-5qr9b\") on node \"crc\" DevicePath \"\"" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.670051 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" event={"ID":"7c9e0c7d-dc65-4862-99da-326bc8d45bfd","Type":"ContainerDied","Data":"056c785611d96cce791fea952d02746ae7d8237c5a27e5a107a3d1799ffee03a"} Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.670096 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="056c785611d96cce791fea952d02746ae7d8237c5a27e5a107a3d1799ffee03a" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.670173 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.766660 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx"] Jan 30 08:43:25 crc kubenswrapper[4870]: E0130 08:43:25.767138 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9e0c7d-dc65-4862-99da-326bc8d45bfd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.767161 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9e0c7d-dc65-4862-99da-326bc8d45bfd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.767406 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9e0c7d-dc65-4862-99da-326bc8d45bfd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.768725 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.774290 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.774480 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.774604 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.774713 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.774819 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.774938 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.775065 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.775512 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.811100 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx"] Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.911551 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.911893 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.911957 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912003 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glrqd\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-kube-api-access-glrqd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912057 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912162 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912286 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912350 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912426 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912481 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912633 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912691 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912750 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:25 crc kubenswrapper[4870]: I0130 08:43:25.912783 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015125 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glrqd\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-kube-api-access-glrqd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015257 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015346 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015447 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015534 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015612 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015670 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015797 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.015845 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.016015 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.016083 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.016154 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.016348 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.016393 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.021582 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.021604 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.023637 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.023967 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.024956 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.025270 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.025534 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.025797 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.025909 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc 
kubenswrapper[4870]: I0130 08:43:26.026322 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.028307 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.031075 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.034818 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glrqd\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-kube-api-access-glrqd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.036353 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.095046 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.626695 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx"] Jan 30 08:43:26 crc kubenswrapper[4870]: I0130 08:43:26.678548 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" event={"ID":"51d5d5e3-867b-4ec9-9fca-07038b83ba29","Type":"ContainerStarted","Data":"2e7af23946f70ff576e386febda735d9b078867d38c0a82ad1d3ba91aef60fca"} Jan 30 08:43:27 crc kubenswrapper[4870]: I0130 08:43:27.688635 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" event={"ID":"51d5d5e3-867b-4ec9-9fca-07038b83ba29","Type":"ContainerStarted","Data":"5f4fa977239c88cc1bd44a55cd7b7490e077c9fbc2dd85cfb9c83866198395b8"} Jan 30 08:43:27 crc kubenswrapper[4870]: I0130 08:43:27.712602 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" podStartSLOduration=2.249843844 podStartE2EDuration="2.712584809s" podCreationTimestamp="2026-01-30 08:43:25 +0000 UTC" firstStartedPulling="2026-01-30 08:43:26.619771486 +0000 UTC m=+2045.315318595" lastFinishedPulling="2026-01-30 08:43:27.082512451 +0000 UTC m=+2045.778059560" observedRunningTime="2026-01-30 08:43:27.709602766 +0000 UTC m=+2046.405149875" 
watchObservedRunningTime="2026-01-30 08:43:27.712584809 +0000 UTC m=+2046.408131918" Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.249034 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.249591 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.249752 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.250483 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9fb81e78dfc8d967fae5bac7b245ddd0a04c8e07a775324570150884f7934d2"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.250544 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://c9fb81e78dfc8d967fae5bac7b245ddd0a04c8e07a775324570150884f7934d2" gracePeriod=600 Jan 30 08:43:55 crc kubenswrapper[4870]: E0130 08:43:55.360825 4870 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3c8db6_cf22_4fb2_ae7c_a3d544473a6d.slice/crio-c9fb81e78dfc8d967fae5bac7b245ddd0a04c8e07a775324570150884f7934d2.scope\": RecentStats: unable to find data in memory cache]" Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.974125 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="c9fb81e78dfc8d967fae5bac7b245ddd0a04c8e07a775324570150884f7934d2" exitCode=0 Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.974200 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"c9fb81e78dfc8d967fae5bac7b245ddd0a04c8e07a775324570150884f7934d2"} Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.974437 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac"} Jan 30 08:43:55 crc kubenswrapper[4870]: I0130 08:43:55.974460 4870 scope.go:117] "RemoveContainer" containerID="792f2902a1f9b1ca12c9f430b11c208dd4f06b6798f364f25a7a0c460c4f9a49" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.495601 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l2hdv"] Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.498386 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.508509 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2hdv"] Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.655324 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-utilities\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.655442 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-catalog-content\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.655650 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x48f8\" (UniqueName: \"kubernetes.io/projected/7015d647-81a4-406d-9ea9-50ba0f8376ba-kube-api-access-x48f8\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.757507 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x48f8\" (UniqueName: \"kubernetes.io/projected/7015d647-81a4-406d-9ea9-50ba0f8376ba-kube-api-access-x48f8\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.757602 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-utilities\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.757672 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-catalog-content\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.758204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-utilities\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.758204 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-catalog-content\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.777502 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x48f8\" (UniqueName: \"kubernetes.io/projected/7015d647-81a4-406d-9ea9-50ba0f8376ba-kube-api-access-x48f8\") pod \"redhat-operators-l2hdv\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:57 crc kubenswrapper[4870]: I0130 08:43:57.834387 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:43:58 crc kubenswrapper[4870]: I0130 08:43:58.332565 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2hdv"] Jan 30 08:43:58 crc kubenswrapper[4870]: W0130 08:43:58.333239 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7015d647_81a4_406d_9ea9_50ba0f8376ba.slice/crio-5ac094d6e426443c5020bca9027144d8416ef01fe23ae02604a7c70ed2fa1c38 WatchSource:0}: Error finding container 5ac094d6e426443c5020bca9027144d8416ef01fe23ae02604a7c70ed2fa1c38: Status 404 returned error can't find the container with id 5ac094d6e426443c5020bca9027144d8416ef01fe23ae02604a7c70ed2fa1c38 Jan 30 08:43:59 crc kubenswrapper[4870]: I0130 08:43:59.009700 4870 generic.go:334] "Generic (PLEG): container finished" podID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerID="7ac3233c81049eeffa09b78437ea9a7c78a5ec459d0969d57b66799e4508c6f7" exitCode=0 Jan 30 08:43:59 crc kubenswrapper[4870]: I0130 08:43:59.010008 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2hdv" event={"ID":"7015d647-81a4-406d-9ea9-50ba0f8376ba","Type":"ContainerDied","Data":"7ac3233c81049eeffa09b78437ea9a7c78a5ec459d0969d57b66799e4508c6f7"} Jan 30 08:43:59 crc kubenswrapper[4870]: I0130 08:43:59.010043 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2hdv" event={"ID":"7015d647-81a4-406d-9ea9-50ba0f8376ba","Type":"ContainerStarted","Data":"5ac094d6e426443c5020bca9027144d8416ef01fe23ae02604a7c70ed2fa1c38"} Jan 30 08:44:00 crc kubenswrapper[4870]: I0130 08:44:00.022255 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2hdv" 
event={"ID":"7015d647-81a4-406d-9ea9-50ba0f8376ba","Type":"ContainerStarted","Data":"0b5963b6473dc23c8416476fa0419e10a0bd0cb40380f00a8cfac8f3dbeeeadd"} Jan 30 08:44:01 crc kubenswrapper[4870]: I0130 08:44:01.033952 4870 generic.go:334] "Generic (PLEG): container finished" podID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerID="0b5963b6473dc23c8416476fa0419e10a0bd0cb40380f00a8cfac8f3dbeeeadd" exitCode=0 Jan 30 08:44:01 crc kubenswrapper[4870]: I0130 08:44:01.034062 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2hdv" event={"ID":"7015d647-81a4-406d-9ea9-50ba0f8376ba","Type":"ContainerDied","Data":"0b5963b6473dc23c8416476fa0419e10a0bd0cb40380f00a8cfac8f3dbeeeadd"} Jan 30 08:44:02 crc kubenswrapper[4870]: I0130 08:44:02.049375 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2hdv" event={"ID":"7015d647-81a4-406d-9ea9-50ba0f8376ba","Type":"ContainerStarted","Data":"393953fb27b08df6d340759380c688ecd76bec4ed10755ff84a58927ae244eb1"} Jan 30 08:44:02 crc kubenswrapper[4870]: I0130 08:44:02.072666 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l2hdv" podStartSLOduration=2.288960599 podStartE2EDuration="5.072647053s" podCreationTimestamp="2026-01-30 08:43:57 +0000 UTC" firstStartedPulling="2026-01-30 08:43:59.012445395 +0000 UTC m=+2077.707992504" lastFinishedPulling="2026-01-30 08:44:01.796131839 +0000 UTC m=+2080.491678958" observedRunningTime="2026-01-30 08:44:02.070558488 +0000 UTC m=+2080.766105597" watchObservedRunningTime="2026-01-30 08:44:02.072647053 +0000 UTC m=+2080.768194162" Jan 30 08:44:04 crc kubenswrapper[4870]: I0130 08:44:04.086015 4870 generic.go:334] "Generic (PLEG): container finished" podID="51d5d5e3-867b-4ec9-9fca-07038b83ba29" containerID="5f4fa977239c88cc1bd44a55cd7b7490e077c9fbc2dd85cfb9c83866198395b8" exitCode=0 Jan 30 08:44:04 crc kubenswrapper[4870]: I0130 08:44:04.104236 
4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" event={"ID":"51d5d5e3-867b-4ec9-9fca-07038b83ba29","Type":"ContainerDied","Data":"5f4fa977239c88cc1bd44a55cd7b7490e077c9fbc2dd85cfb9c83866198395b8"} Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.540063 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731460 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-libvirt-combined-ca-bundle\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731540 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-ovn-default-certs-0\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731626 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-neutron-metadata-combined-ca-bundle\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731671 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-nova-combined-ca-bundle\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: 
\"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731760 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-bootstrap-combined-ca-bundle\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731812 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glrqd\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-kube-api-access-glrqd\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731900 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-telemetry-combined-ca-bundle\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731959 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-repo-setup-combined-ca-bundle\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.731986 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc 
kubenswrapper[4870]: I0130 08:44:05.732019 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ovn-combined-ca-bundle\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.732048 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ssh-key-openstack-edpm-ipam\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.732076 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.732110 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-inventory\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.732137 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\" (UID: \"51d5d5e3-867b-4ec9-9fca-07038b83ba29\") " Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.740649 4870 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.740690 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.740966 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.741124 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.741274 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.746117 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.746544 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.746631 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.746954 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.748352 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-kube-api-access-glrqd" (OuterVolumeSpecName: "kube-api-access-glrqd") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "kube-api-access-glrqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.750527 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.753701 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.779927 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.781470 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-inventory" (OuterVolumeSpecName: "inventory") pod "51d5d5e3-867b-4ec9-9fca-07038b83ba29" (UID: "51d5d5e3-867b-4ec9-9fca-07038b83ba29"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835402 4870 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835440 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glrqd\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-kube-api-access-glrqd\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835450 4870 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835458 4870 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835471 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835483 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835497 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835509 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835523 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835533 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 
08:44:05.835543 4870 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835552 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/51d5d5e3-867b-4ec9-9fca-07038b83ba29-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835560 4870 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:05 crc kubenswrapper[4870]: I0130 08:44:05.835570 4870 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d5d5e3-867b-4ec9-9fca-07038b83ba29-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.104719 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" event={"ID":"51d5d5e3-867b-4ec9-9fca-07038b83ba29","Type":"ContainerDied","Data":"2e7af23946f70ff576e386febda735d9b078867d38c0a82ad1d3ba91aef60fca"} Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.104770 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e7af23946f70ff576e386febda735d9b078867d38c0a82ad1d3ba91aef60fca" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.104810 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.223238 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z"] Jan 30 08:44:06 crc kubenswrapper[4870]: E0130 08:44:06.223701 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d5d5e3-867b-4ec9-9fca-07038b83ba29" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.223725 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d5d5e3-867b-4ec9-9fca-07038b83ba29" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.223937 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d5d5e3-867b-4ec9-9fca-07038b83ba29" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.224602 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.226786 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.226799 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.226975 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.227015 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.228470 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.236768 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z"] Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.345598 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.345695 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: 
\"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.345726 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.345783 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.345934 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlvcq\" (UniqueName: \"kubernetes.io/projected/11f380d9-7c41-4b65-a46d-01c14ac81c07-kube-api-access-xlvcq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.447953 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.448023 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xlvcq\" (UniqueName: \"kubernetes.io/projected/11f380d9-7c41-4b65-a46d-01c14ac81c07-kube-api-access-xlvcq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.448137 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.448190 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.448235 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.449203 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.451978 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.452271 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.453269 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.465828 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlvcq\" (UniqueName: \"kubernetes.io/projected/11f380d9-7c41-4b65-a46d-01c14ac81c07-kube-api-access-xlvcq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-8z72z\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:06 crc kubenswrapper[4870]: I0130 08:44:06.549636 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:44:07 crc kubenswrapper[4870]: I0130 08:44:07.131555 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z"] Jan 30 08:44:07 crc kubenswrapper[4870]: I0130 08:44:07.837051 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:44:07 crc kubenswrapper[4870]: I0130 08:44:07.839251 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:44:07 crc kubenswrapper[4870]: I0130 08:44:07.892392 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:44:08 crc kubenswrapper[4870]: I0130 08:44:08.122758 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" event={"ID":"11f380d9-7c41-4b65-a46d-01c14ac81c07","Type":"ContainerStarted","Data":"a16d38c22e97a59730cb881ed767b3b67bb0324d1db69e50abec5c00f259b66e"} Jan 30 08:44:08 crc kubenswrapper[4870]: I0130 08:44:08.173649 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:44:08 crc kubenswrapper[4870]: I0130 08:44:08.226081 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2hdv"] Jan 30 08:44:09 crc kubenswrapper[4870]: I0130 08:44:09.133276 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" event={"ID":"11f380d9-7c41-4b65-a46d-01c14ac81c07","Type":"ContainerStarted","Data":"aa4d81ec188518a93eca2918c2b6aef7524e7ef50e3d95e59184540a039c7929"} Jan 30 08:44:09 crc kubenswrapper[4870]: I0130 08:44:09.153046 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" podStartSLOduration=2.340435801 podStartE2EDuration="3.153028434s" podCreationTimestamp="2026-01-30 08:44:06 +0000 UTC" firstStartedPulling="2026-01-30 08:44:07.136363643 +0000 UTC m=+2085.831910752" lastFinishedPulling="2026-01-30 08:44:07.948956276 +0000 UTC m=+2086.644503385" observedRunningTime="2026-01-30 08:44:09.149389941 +0000 UTC m=+2087.844937050" watchObservedRunningTime="2026-01-30 08:44:09.153028434 +0000 UTC m=+2087.848575543" Jan 30 08:44:10 crc kubenswrapper[4870]: I0130 08:44:10.141178 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l2hdv" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerName="registry-server" containerID="cri-o://393953fb27b08df6d340759380c688ecd76bec4ed10755ff84a58927ae244eb1" gracePeriod=2 Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.152249 4870 generic.go:334] "Generic (PLEG): container finished" podID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerID="393953fb27b08df6d340759380c688ecd76bec4ed10755ff84a58927ae244eb1" exitCode=0 Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.152332 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2hdv" event={"ID":"7015d647-81a4-406d-9ea9-50ba0f8376ba","Type":"ContainerDied","Data":"393953fb27b08df6d340759380c688ecd76bec4ed10755ff84a58927ae244eb1"} Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.689015 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.889332 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-catalog-content\") pod \"7015d647-81a4-406d-9ea9-50ba0f8376ba\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.889564 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x48f8\" (UniqueName: \"kubernetes.io/projected/7015d647-81a4-406d-9ea9-50ba0f8376ba-kube-api-access-x48f8\") pod \"7015d647-81a4-406d-9ea9-50ba0f8376ba\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.889625 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-utilities\") pod \"7015d647-81a4-406d-9ea9-50ba0f8376ba\" (UID: \"7015d647-81a4-406d-9ea9-50ba0f8376ba\") " Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.891001 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-utilities" (OuterVolumeSpecName: "utilities") pod "7015d647-81a4-406d-9ea9-50ba0f8376ba" (UID: "7015d647-81a4-406d-9ea9-50ba0f8376ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.896152 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7015d647-81a4-406d-9ea9-50ba0f8376ba-kube-api-access-x48f8" (OuterVolumeSpecName: "kube-api-access-x48f8") pod "7015d647-81a4-406d-9ea9-50ba0f8376ba" (UID: "7015d647-81a4-406d-9ea9-50ba0f8376ba"). InnerVolumeSpecName "kube-api-access-x48f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.992814 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x48f8\" (UniqueName: \"kubernetes.io/projected/7015d647-81a4-406d-9ea9-50ba0f8376ba-kube-api-access-x48f8\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:11 crc kubenswrapper[4870]: I0130 08:44:11.992856 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.010779 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7015d647-81a4-406d-9ea9-50ba0f8376ba" (UID: "7015d647-81a4-406d-9ea9-50ba0f8376ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.093769 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7015d647-81a4-406d-9ea9-50ba0f8376ba-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.170170 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2hdv" event={"ID":"7015d647-81a4-406d-9ea9-50ba0f8376ba","Type":"ContainerDied","Data":"5ac094d6e426443c5020bca9027144d8416ef01fe23ae02604a7c70ed2fa1c38"} Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.170247 4870 scope.go:117] "RemoveContainer" containerID="393953fb27b08df6d340759380c688ecd76bec4ed10755ff84a58927ae244eb1" Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.170264 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2hdv" Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.193633 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2hdv"] Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.202027 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l2hdv"] Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.204709 4870 scope.go:117] "RemoveContainer" containerID="0b5963b6473dc23c8416476fa0419e10a0bd0cb40380f00a8cfac8f3dbeeeadd" Jan 30 08:44:12 crc kubenswrapper[4870]: I0130 08:44:12.229772 4870 scope.go:117] "RemoveContainer" containerID="7ac3233c81049eeffa09b78437ea9a7c78a5ec459d0969d57b66799e4508c6f7" Jan 30 08:44:14 crc kubenswrapper[4870]: I0130 08:44:14.086334 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" path="/var/lib/kubelet/pods/7015d647-81a4-406d-9ea9-50ba0f8376ba/volumes" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.152717 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj"] Jan 30 08:45:00 crc kubenswrapper[4870]: E0130 08:45:00.154379 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerName="extract-utilities" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.154406 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerName="extract-utilities" Jan 30 08:45:00 crc kubenswrapper[4870]: E0130 08:45:00.154459 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerName="extract-content" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.154466 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" 
containerName="extract-content" Jan 30 08:45:00 crc kubenswrapper[4870]: E0130 08:45:00.154479 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerName="registry-server" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.154485 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerName="registry-server" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.154744 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="7015d647-81a4-406d-9ea9-50ba0f8376ba" containerName="registry-server" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.155858 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.159510 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.159764 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.172828 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj"] Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.333802 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjts2\" (UniqueName: \"kubernetes.io/projected/e9c91153-7a90-4c60-811f-915f8ccf0bdf-kube-api-access-qjts2\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.334252 4870 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9c91153-7a90-4c60-811f-915f8ccf0bdf-secret-volume\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.334393 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c91153-7a90-4c60-811f-915f8ccf0bdf-config-volume\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.436869 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c91153-7a90-4c60-811f-915f8ccf0bdf-config-volume\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.436995 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjts2\" (UniqueName: \"kubernetes.io/projected/e9c91153-7a90-4c60-811f-915f8ccf0bdf-kube-api-access-qjts2\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.437097 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9c91153-7a90-4c60-811f-915f8ccf0bdf-secret-volume\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.438259 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c91153-7a90-4c60-811f-915f8ccf0bdf-config-volume\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.449178 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9c91153-7a90-4c60-811f-915f8ccf0bdf-secret-volume\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.458675 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjts2\" (UniqueName: \"kubernetes.io/projected/e9c91153-7a90-4c60-811f-915f8ccf0bdf-kube-api-access-qjts2\") pod \"collect-profiles-29496045-wr5sj\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.487052 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:00 crc kubenswrapper[4870]: I0130 08:45:00.947706 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj"] Jan 30 08:45:01 crc kubenswrapper[4870]: I0130 08:45:01.676419 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" event={"ID":"e9c91153-7a90-4c60-811f-915f8ccf0bdf","Type":"ContainerStarted","Data":"479ba30159faf1bd5abe17d0fd8bcbe0c86c787b6f5f69ef68ac1e6330cdb3a2"} Jan 30 08:45:01 crc kubenswrapper[4870]: I0130 08:45:01.676707 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" event={"ID":"e9c91153-7a90-4c60-811f-915f8ccf0bdf","Type":"ContainerStarted","Data":"aa224fd3965f88a5a13b568410ba8bb45dead9a3a435594797d76327580780a2"} Jan 30 08:45:01 crc kubenswrapper[4870]: I0130 08:45:01.702246 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" podStartSLOduration=1.702221442 podStartE2EDuration="1.702221442s" podCreationTimestamp="2026-01-30 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:45:01.692791208 +0000 UTC m=+2140.388338327" watchObservedRunningTime="2026-01-30 08:45:01.702221442 +0000 UTC m=+2140.397768561" Jan 30 08:45:02 crc kubenswrapper[4870]: I0130 08:45:02.686229 4870 generic.go:334] "Generic (PLEG): container finished" podID="e9c91153-7a90-4c60-811f-915f8ccf0bdf" containerID="479ba30159faf1bd5abe17d0fd8bcbe0c86c787b6f5f69ef68ac1e6330cdb3a2" exitCode=0 Jan 30 08:45:02 crc kubenswrapper[4870]: I0130 08:45:02.686285 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" event={"ID":"e9c91153-7a90-4c60-811f-915f8ccf0bdf","Type":"ContainerDied","Data":"479ba30159faf1bd5abe17d0fd8bcbe0c86c787b6f5f69ef68ac1e6330cdb3a2"} Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.034114 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.219030 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjts2\" (UniqueName: \"kubernetes.io/projected/e9c91153-7a90-4c60-811f-915f8ccf0bdf-kube-api-access-qjts2\") pod \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.219267 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9c91153-7a90-4c60-811f-915f8ccf0bdf-secret-volume\") pod \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.219462 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c91153-7a90-4c60-811f-915f8ccf0bdf-config-volume\") pod \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\" (UID: \"e9c91153-7a90-4c60-811f-915f8ccf0bdf\") " Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.220369 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c91153-7a90-4c60-811f-915f8ccf0bdf-config-volume" (OuterVolumeSpecName: "config-volume") pod "e9c91153-7a90-4c60-811f-915f8ccf0bdf" (UID: "e9c91153-7a90-4c60-811f-915f8ccf0bdf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.223102 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c91153-7a90-4c60-811f-915f8ccf0bdf-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.226699 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c91153-7a90-4c60-811f-915f8ccf0bdf-kube-api-access-qjts2" (OuterVolumeSpecName: "kube-api-access-qjts2") pod "e9c91153-7a90-4c60-811f-915f8ccf0bdf" (UID: "e9c91153-7a90-4c60-811f-915f8ccf0bdf"). InnerVolumeSpecName "kube-api-access-qjts2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.226750 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9c91153-7a90-4c60-811f-915f8ccf0bdf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e9c91153-7a90-4c60-811f-915f8ccf0bdf" (UID: "e9c91153-7a90-4c60-811f-915f8ccf0bdf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.325814 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjts2\" (UniqueName: \"kubernetes.io/projected/e9c91153-7a90-4c60-811f-915f8ccf0bdf-kube-api-access-qjts2\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.325847 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9c91153-7a90-4c60-811f-915f8ccf0bdf-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.704429 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" event={"ID":"e9c91153-7a90-4c60-811f-915f8ccf0bdf","Type":"ContainerDied","Data":"aa224fd3965f88a5a13b568410ba8bb45dead9a3a435594797d76327580780a2"} Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.704475 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa224fd3965f88a5a13b568410ba8bb45dead9a3a435594797d76327580780a2" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.704488 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj" Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.774453 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"] Jan 30 08:45:04 crc kubenswrapper[4870]: I0130 08:45:04.795989 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496000-25p92"] Jan 30 08:45:06 crc kubenswrapper[4870]: I0130 08:45:06.087705 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fd6b37-eee2-4fd5-aa18-51eecea65a3b" path="/var/lib/kubelet/pods/93fd6b37-eee2-4fd5-aa18-51eecea65a3b/volumes" Jan 30 08:45:12 crc kubenswrapper[4870]: I0130 08:45:12.780964 4870 generic.go:334] "Generic (PLEG): container finished" podID="11f380d9-7c41-4b65-a46d-01c14ac81c07" containerID="aa4d81ec188518a93eca2918c2b6aef7524e7ef50e3d95e59184540a039c7929" exitCode=0 Jan 30 08:45:12 crc kubenswrapper[4870]: I0130 08:45:12.781540 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" event={"ID":"11f380d9-7c41-4b65-a46d-01c14ac81c07","Type":"ContainerDied","Data":"aa4d81ec188518a93eca2918c2b6aef7524e7ef50e3d95e59184540a039c7929"} Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.231362 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.327987 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovn-combined-ca-bundle\") pod \"11f380d9-7c41-4b65-a46d-01c14ac81c07\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.328042 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovncontroller-config-0\") pod \"11f380d9-7c41-4b65-a46d-01c14ac81c07\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.328268 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-inventory\") pod \"11f380d9-7c41-4b65-a46d-01c14ac81c07\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.328290 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ssh-key-openstack-edpm-ipam\") pod \"11f380d9-7c41-4b65-a46d-01c14ac81c07\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.328307 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlvcq\" (UniqueName: \"kubernetes.io/projected/11f380d9-7c41-4b65-a46d-01c14ac81c07-kube-api-access-xlvcq\") pod \"11f380d9-7c41-4b65-a46d-01c14ac81c07\" (UID: \"11f380d9-7c41-4b65-a46d-01c14ac81c07\") " Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.333396 4870 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "11f380d9-7c41-4b65-a46d-01c14ac81c07" (UID: "11f380d9-7c41-4b65-a46d-01c14ac81c07"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.333594 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f380d9-7c41-4b65-a46d-01c14ac81c07-kube-api-access-xlvcq" (OuterVolumeSpecName: "kube-api-access-xlvcq") pod "11f380d9-7c41-4b65-a46d-01c14ac81c07" (UID: "11f380d9-7c41-4b65-a46d-01c14ac81c07"). InnerVolumeSpecName "kube-api-access-xlvcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.352400 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "11f380d9-7c41-4b65-a46d-01c14ac81c07" (UID: "11f380d9-7c41-4b65-a46d-01c14ac81c07"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.355022 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-inventory" (OuterVolumeSpecName: "inventory") pod "11f380d9-7c41-4b65-a46d-01c14ac81c07" (UID: "11f380d9-7c41-4b65-a46d-01c14ac81c07"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.357231 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "11f380d9-7c41-4b65-a46d-01c14ac81c07" (UID: "11f380d9-7c41-4b65-a46d-01c14ac81c07"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.430458 4870 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.430498 4870 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/11f380d9-7c41-4b65-a46d-01c14ac81c07-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.430508 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.430518 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11f380d9-7c41-4b65-a46d-01c14ac81c07-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.430531 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlvcq\" (UniqueName: \"kubernetes.io/projected/11f380d9-7c41-4b65-a46d-01c14ac81c07-kube-api-access-xlvcq\") on node \"crc\" DevicePath \"\"" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.800034 4870 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" event={"ID":"11f380d9-7c41-4b65-a46d-01c14ac81c07","Type":"ContainerDied","Data":"a16d38c22e97a59730cb881ed767b3b67bb0324d1db69e50abec5c00f259b66e"} Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.800070 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a16d38c22e97a59730cb881ed767b3b67bb0324d1db69e50abec5c00f259b66e" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.800087 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-8z72z" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.901709 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8"] Jan 30 08:45:14 crc kubenswrapper[4870]: E0130 08:45:14.902313 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f380d9-7c41-4b65-a46d-01c14ac81c07" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.902336 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f380d9-7c41-4b65-a46d-01c14ac81c07" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 08:45:14 crc kubenswrapper[4870]: E0130 08:45:14.902368 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c91153-7a90-4c60-811f-915f8ccf0bdf" containerName="collect-profiles" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.902381 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c91153-7a90-4c60-811f-915f8ccf0bdf" containerName="collect-profiles" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.902607 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c91153-7a90-4c60-811f-915f8ccf0bdf" containerName="collect-profiles" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.902656 4870 
memory_manager.go:354] "RemoveStaleState removing state" podUID="11f380d9-7c41-4b65-a46d-01c14ac81c07" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.903540 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.906504 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.906735 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.907006 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.907573 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.908759 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.908856 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:45:14 crc kubenswrapper[4870]: I0130 08:45:14.912279 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8"] Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.041977 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.042060 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.042095 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.042146 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.042362 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcwcv\" (UniqueName: \"kubernetes.io/projected/bbcba502-7991-4f7b-bdbd-b112cec436b9-kube-api-access-mcwcv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" 
(UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.042408 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.144270 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.145467 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.145629 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.145862 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcwcv\" (UniqueName: \"kubernetes.io/projected/bbcba502-7991-4f7b-bdbd-b112cec436b9-kube-api-access-mcwcv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.146064 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.146199 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.150594 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 
08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.150705 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.151121 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.152507 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.152529 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.166161 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mcwcv\" (UniqueName: \"kubernetes.io/projected/bbcba502-7991-4f7b-bdbd-b112cec436b9-kube-api-access-mcwcv\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8"
Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.235587 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8"
Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.731670 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8"]
Jan 30 08:45:15 crc kubenswrapper[4870]: I0130 08:45:15.811848 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" event={"ID":"bbcba502-7991-4f7b-bdbd-b112cec436b9","Type":"ContainerStarted","Data":"322fb62ba42e648747673d353c56acbccd668b8ace4d3ac0389c386c755de181"}
Jan 30 08:45:16 crc kubenswrapper[4870]: I0130 08:45:16.835666 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" event={"ID":"bbcba502-7991-4f7b-bdbd-b112cec436b9","Type":"ContainerStarted","Data":"bee60a9b8d5f543e434738aa5f0e9131d5a086cce903cbf89bbe4f59ccb94b7e"}
Jan 30 08:45:16 crc kubenswrapper[4870]: I0130 08:45:16.857987 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" podStartSLOduration=2.375650216 podStartE2EDuration="2.85796892s" podCreationTimestamp="2026-01-30 08:45:14 +0000 UTC" firstStartedPulling="2026-01-30 08:45:15.734761642 +0000 UTC m=+2154.430308751" lastFinishedPulling="2026-01-30 08:45:16.217080326 +0000 UTC m=+2154.912627455" observedRunningTime="2026-01-30 08:45:16.85346494 +0000 UTC m=+2155.549012059" watchObservedRunningTime="2026-01-30 08:45:16.85796892 +0000 UTC m=+2155.553516029"
Jan 30 08:45:37 crc kubenswrapper[4870]: I0130 08:45:37.829380 4870 scope.go:117] "RemoveContainer" containerID="2065e95b92696a8bb664d6087b11271d4f8873eafbf3cee077ccf40c2dbf8d79"
Jan 30 08:45:55 crc kubenswrapper[4870]: I0130 08:45:55.249072 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:45:55 crc kubenswrapper[4870]: I0130 08:45:55.249595 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 08:46:04 crc kubenswrapper[4870]: I0130 08:46:04.950638 4870 generic.go:334] "Generic (PLEG): container finished" podID="bbcba502-7991-4f7b-bdbd-b112cec436b9" containerID="bee60a9b8d5f543e434738aa5f0e9131d5a086cce903cbf89bbe4f59ccb94b7e" exitCode=0
Jan 30 08:46:04 crc kubenswrapper[4870]: I0130 08:46:04.950719 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" event={"ID":"bbcba502-7991-4f7b-bdbd-b112cec436b9","Type":"ContainerDied","Data":"bee60a9b8d5f543e434738aa5f0e9131d5a086cce903cbf89bbe4f59ccb94b7e"}
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.322330 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k2s7g"]
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.330708 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.345555 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2s7g"]
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.426634 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-utilities\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.426751 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lscq\" (UniqueName: \"kubernetes.io/projected/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-kube-api-access-5lscq\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.426789 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-catalog-content\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.528663 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-utilities\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.528773 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lscq\" (UniqueName: \"kubernetes.io/projected/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-kube-api-access-5lscq\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.528802 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-catalog-content\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.529267 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-catalog-content\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.529444 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-utilities\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.551004 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lscq\" (UniqueName: \"kubernetes.io/projected/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-kube-api-access-5lscq\") pod \"community-operators-k2s7g\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") " pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:05 crc kubenswrapper[4870]: I0130 08:46:05.662172 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.283702 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2s7g"]
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.433818 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8"
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.557563 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-metadata-combined-ca-bundle\") pod \"bbcba502-7991-4f7b-bdbd-b112cec436b9\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") "
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.557669 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-inventory\") pod \"bbcba502-7991-4f7b-bdbd-b112cec436b9\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") "
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.557698 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcwcv\" (UniqueName: \"kubernetes.io/projected/bbcba502-7991-4f7b-bdbd-b112cec436b9-kube-api-access-mcwcv\") pod \"bbcba502-7991-4f7b-bdbd-b112cec436b9\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") "
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.557921 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-nova-metadata-neutron-config-0\") pod \"bbcba502-7991-4f7b-bdbd-b112cec436b9\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") "
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.557946 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-ssh-key-openstack-edpm-ipam\") pod \"bbcba502-7991-4f7b-bdbd-b112cec436b9\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") "
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.558009 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"bbcba502-7991-4f7b-bdbd-b112cec436b9\" (UID: \"bbcba502-7991-4f7b-bdbd-b112cec436b9\") "
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.572066 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bbcba502-7991-4f7b-bdbd-b112cec436b9" (UID: "bbcba502-7991-4f7b-bdbd-b112cec436b9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.583253 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbcba502-7991-4f7b-bdbd-b112cec436b9-kube-api-access-mcwcv" (OuterVolumeSpecName: "kube-api-access-mcwcv") pod "bbcba502-7991-4f7b-bdbd-b112cec436b9" (UID: "bbcba502-7991-4f7b-bdbd-b112cec436b9"). InnerVolumeSpecName "kube-api-access-mcwcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.620010 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "bbcba502-7991-4f7b-bdbd-b112cec436b9" (UID: "bbcba502-7991-4f7b-bdbd-b112cec436b9"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.657088 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bbcba502-7991-4f7b-bdbd-b112cec436b9" (UID: "bbcba502-7991-4f7b-bdbd-b112cec436b9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.661810 4870 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.661864 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcwcv\" (UniqueName: \"kubernetes.io/projected/bbcba502-7991-4f7b-bdbd-b112cec436b9-kube-api-access-mcwcv\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.661902 4870 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.661918 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.689072 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-inventory" (OuterVolumeSpecName: "inventory") pod "bbcba502-7991-4f7b-bdbd-b112cec436b9" (UID: "bbcba502-7991-4f7b-bdbd-b112cec436b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.694764 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "bbcba502-7991-4f7b-bdbd-b112cec436b9" (UID: "bbcba502-7991-4f7b-bdbd-b112cec436b9"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.763281 4870 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.763589 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcba502-7991-4f7b-bdbd-b112cec436b9-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.981160 4870 generic.go:334] "Generic (PLEG): container finished" podID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerID="1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8" exitCode=0
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.981237 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2s7g" event={"ID":"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1","Type":"ContainerDied","Data":"1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8"}
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.981276 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2s7g" event={"ID":"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1","Type":"ContainerStarted","Data":"15c6cbff9549dfa3cb8b176dfb869490faf191326af5b9742ef3e438454b1062"}
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.983236 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8" event={"ID":"bbcba502-7991-4f7b-bdbd-b112cec436b9","Type":"ContainerDied","Data":"322fb62ba42e648747673d353c56acbccd668b8ace4d3ac0389c386c755de181"}
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.983262 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="322fb62ba42e648747673d353c56acbccd668b8ace4d3ac0389c386c755de181"
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.983318 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8"
Jan 30 08:46:06 crc kubenswrapper[4870]: I0130 08:46:06.986486 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.068815 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"]
Jan 30 08:46:07 crc kubenswrapper[4870]: E0130 08:46:07.069417 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcba502-7991-4f7b-bdbd-b112cec436b9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.069442 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcba502-7991-4f7b-bdbd-b112cec436b9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.069699 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbcba502-7991-4f7b-bdbd-b112cec436b9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.070648 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.078087 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"]
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.078334 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.078399 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.078525 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.078628 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.078768 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.178191 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqzbr\" (UniqueName: \"kubernetes.io/projected/9e214e41-a575-467c-a053-d6807c4f1512-kube-api-access-sqzbr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.178508 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.178587 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.178835 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.179001 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.280475 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.280575 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.280657 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.280709 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.280759 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqzbr\" (UniqueName: \"kubernetes.io/projected/9e214e41-a575-467c-a053-d6807c4f1512-kube-api-access-sqzbr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.288419 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.288479 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.288592 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.289039 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.308274 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqzbr\" (UniqueName: \"kubernetes.io/projected/9e214e41-a575-467c-a053-d6807c4f1512-kube-api-access-sqzbr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-26lfr\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.394054 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"
Jan 30 08:46:07 crc kubenswrapper[4870]: I0130 08:46:07.940998 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr"]
Jan 30 08:46:07 crc kubenswrapper[4870]: W0130 08:46:07.947414 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e214e41_a575_467c_a053_d6807c4f1512.slice/crio-958101a8528cf0e341326792f2f6ae6259384318c3c17b73fec5224f1c834d1c WatchSource:0}: Error finding container 958101a8528cf0e341326792f2f6ae6259384318c3c17b73fec5224f1c834d1c: Status 404 returned error can't find the container with id 958101a8528cf0e341326792f2f6ae6259384318c3c17b73fec5224f1c834d1c
Jan 30 08:46:08 crc kubenswrapper[4870]: I0130 08:46:08.011813 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2s7g" event={"ID":"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1","Type":"ContainerStarted","Data":"ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37"}
Jan 30 08:46:08 crc kubenswrapper[4870]: I0130 08:46:08.013223 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr" event={"ID":"9e214e41-a575-467c-a053-d6807c4f1512","Type":"ContainerStarted","Data":"958101a8528cf0e341326792f2f6ae6259384318c3c17b73fec5224f1c834d1c"}
Jan 30 08:46:09 crc kubenswrapper[4870]: I0130 08:46:09.025345 4870 generic.go:334] "Generic (PLEG): container finished" podID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerID="ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37" exitCode=0
Jan 30 08:46:09 crc kubenswrapper[4870]: I0130 08:46:09.025401 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2s7g" event={"ID":"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1","Type":"ContainerDied","Data":"ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37"}
Jan 30 08:46:09 crc kubenswrapper[4870]: I0130 08:46:09.028012 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr" event={"ID":"9e214e41-a575-467c-a053-d6807c4f1512","Type":"ContainerStarted","Data":"849cf3c1bde9a8f54182673096d806dd29610e89f52d8b6fde1f213230d3c284"}
Jan 30 08:46:09 crc kubenswrapper[4870]: I0130 08:46:09.064762 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr" podStartSLOduration=1.558669259 podStartE2EDuration="2.064739244s" podCreationTimestamp="2026-01-30 08:46:07 +0000 UTC" firstStartedPulling="2026-01-30 08:46:07.952237688 +0000 UTC m=+2206.647784797" lastFinishedPulling="2026-01-30 08:46:08.458307673 +0000 UTC m=+2207.153854782" observedRunningTime="2026-01-30 08:46:09.061849904 +0000 UTC m=+2207.757397023" watchObservedRunningTime="2026-01-30 08:46:09.064739244 +0000 UTC m=+2207.760286363"
Jan 30 08:46:10 crc kubenswrapper[4870]: I0130 08:46:10.039008 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2s7g" event={"ID":"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1","Type":"ContainerStarted","Data":"f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1"}
Jan 30 08:46:10 crc kubenswrapper[4870]: I0130 08:46:10.065539 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k2s7g" podStartSLOduration=2.53888043 podStartE2EDuration="5.06552182s" podCreationTimestamp="2026-01-30 08:46:05 +0000 UTC" firstStartedPulling="2026-01-30 08:46:06.986067614 +0000 UTC m=+2205.681614723" lastFinishedPulling="2026-01-30 08:46:09.512708984 +0000 UTC m=+2208.208256113" observedRunningTime="2026-01-30 08:46:10.056679054 +0000 UTC m=+2208.752226193" watchObservedRunningTime="2026-01-30 08:46:10.06552182 +0000 UTC m=+2208.761068919"
Jan 30 08:46:15 crc kubenswrapper[4870]: I0130 08:46:15.662955 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:15 crc kubenswrapper[4870]: I0130 08:46:15.663317 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:15 crc kubenswrapper[4870]: I0130 08:46:15.729569 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:16 crc kubenswrapper[4870]: I0130 08:46:16.141533 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:16 crc kubenswrapper[4870]: I0130 08:46:16.195063 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2s7g"]
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.108439 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k2s7g" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="registry-server" containerID="cri-o://f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1" gracePeriod=2
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.585069 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.725136 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-utilities\") pod \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") "
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.725232 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lscq\" (UniqueName: \"kubernetes.io/projected/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-kube-api-access-5lscq\") pod \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") "
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.725297 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-catalog-content\") pod \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\" (UID: \"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1\") "
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.726458 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-utilities" (OuterVolumeSpecName: "utilities") pod "5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" (UID: "5befb693-ddf7-4fe9-b77a-1ec961e1b2f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.784086 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-kube-api-access-5lscq" (OuterVolumeSpecName: "kube-api-access-5lscq") pod "5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" (UID: "5befb693-ddf7-4fe9-b77a-1ec961e1b2f1"). InnerVolumeSpecName "kube-api-access-5lscq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.800368 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" (UID: "5befb693-ddf7-4fe9-b77a-1ec961e1b2f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.827453 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.827483 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lscq\" (UniqueName: \"kubernetes.io/projected/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-kube-api-access-5lscq\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:18 crc kubenswrapper[4870]: I0130 08:46:18.827494 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.122278 4870 generic.go:334] "Generic (PLEG): container finished" podID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerID="f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1" exitCode=0
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.122356 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2s7g"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.122379 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2s7g" event={"ID":"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1","Type":"ContainerDied","Data":"f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1"}
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.122930 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2s7g" event={"ID":"5befb693-ddf7-4fe9-b77a-1ec961e1b2f1","Type":"ContainerDied","Data":"15c6cbff9549dfa3cb8b176dfb869490faf191326af5b9742ef3e438454b1062"}
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.122969 4870 scope.go:117] "RemoveContainer" containerID="f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.153836 4870 scope.go:117] "RemoveContainer" containerID="ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.179333 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2s7g"]
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.191763 4870 scope.go:117] "RemoveContainer" containerID="1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.195645 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k2s7g"]
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.263503 4870 scope.go:117] "RemoveContainer" containerID="f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1"
Jan 30 08:46:19 crc kubenswrapper[4870]: E0130 08:46:19.263832 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1\": container with ID starting with f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1 not found: ID does not exist" containerID="f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.263861 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1"} err="failed to get container status \"f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1\": rpc error: code = NotFound desc = could not find container \"f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1\": container with ID starting with f106933e145a67e8999b56022a3749d70fbe3673069ce5dcb65b1319a3995fe1 not found: ID does not exist"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.263902 4870 scope.go:117] "RemoveContainer" containerID="ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37"
Jan 30 08:46:19 crc kubenswrapper[4870]: E0130 08:46:19.264367 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37\": container with ID starting with ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37 not found: ID does not exist" containerID="ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.264387 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37"} err="failed to get container status \"ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37\": rpc error: code = NotFound desc = could not find container \"ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37\": container with ID starting with ac1d83add4726895af17972200c5228205b0b5c89bfd1defebc40a14e95dbe37 not found: ID does not exist"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.264400 4870 scope.go:117] "RemoveContainer" containerID="1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8"
Jan 30 08:46:19 crc kubenswrapper[4870]: E0130 08:46:19.264578 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8\": container with ID starting with 1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8 not found: ID does not exist" containerID="1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8"
Jan 30 08:46:19 crc kubenswrapper[4870]: I0130 08:46:19.264594 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8"} err="failed to get container status \"1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8\": rpc error: code = NotFound desc = could not find container \"1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8\": container with ID starting with 1fa08930f61eeae7ff87f60ec6402e00026918ea40b0836370f9478928ccd5d8 not found: ID does not exist"
Jan 30 08:46:20 crc kubenswrapper[4870]: I0130 08:46:20.085310 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" path="/var/lib/kubelet/pods/5befb693-ddf7-4fe9-b77a-1ec961e1b2f1/volumes"
Jan 30 08:46:25 crc kubenswrapper[4870]: I0130 08:46:25.250265 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:46:25 crc kubenswrapper[4870]: I0130
08:46:25.250673 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.249870 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.250400 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.250455 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.251272 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.251335 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" 
containerName="machine-config-daemon" containerID="cri-o://c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" gracePeriod=600 Jan 30 08:46:55 crc kubenswrapper[4870]: E0130 08:46:55.425098 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.457063 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" exitCode=0 Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.457369 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac"} Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.457528 4870 scope.go:117] "RemoveContainer" containerID="c9fb81e78dfc8d967fae5bac7b245ddd0a04c8e07a775324570150884f7934d2" Jan 30 08:46:55 crc kubenswrapper[4870]: I0130 08:46:55.458131 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:46:55 crc kubenswrapper[4870]: E0130 08:46:55.458366 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.103285 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-trmvn"] Jan 30 08:46:57 crc kubenswrapper[4870]: E0130 08:46:57.104050 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="extract-utilities" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.104069 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="extract-utilities" Jan 30 08:46:57 crc kubenswrapper[4870]: E0130 08:46:57.104111 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="registry-server" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.104120 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="registry-server" Jan 30 08:46:57 crc kubenswrapper[4870]: E0130 08:46:57.104135 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="extract-content" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.104142 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="extract-content" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.104423 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="5befb693-ddf7-4fe9-b77a-1ec961e1b2f1" containerName="registry-server" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.106205 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.114864 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-trmvn"] Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.260386 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-utilities\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.260453 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-catalog-content\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.260556 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtf45\" (UniqueName: \"kubernetes.io/projected/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-kube-api-access-wtf45\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.363109 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-utilities\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.363207 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-catalog-content\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.363726 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-utilities\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.364004 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-catalog-content\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.364203 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtf45\" (UniqueName: \"kubernetes.io/projected/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-kube-api-access-wtf45\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.388524 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtf45\" (UniqueName: \"kubernetes.io/projected/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-kube-api-access-wtf45\") pod \"certified-operators-trmvn\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.427037 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:46:57 crc kubenswrapper[4870]: I0130 08:46:57.771955 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-trmvn"] Jan 30 08:46:58 crc kubenswrapper[4870]: I0130 08:46:58.488641 4870 generic.go:334] "Generic (PLEG): container finished" podID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerID="fb63265b6df908944c072b8b056481ba1574a9802aca45041ac50a3b50cbe3ff" exitCode=0 Jan 30 08:46:58 crc kubenswrapper[4870]: I0130 08:46:58.488711 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trmvn" event={"ID":"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a","Type":"ContainerDied","Data":"fb63265b6df908944c072b8b056481ba1574a9802aca45041ac50a3b50cbe3ff"} Jan 30 08:46:58 crc kubenswrapper[4870]: I0130 08:46:58.489097 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trmvn" event={"ID":"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a","Type":"ContainerStarted","Data":"d437a8dec9fb30f00a4e34ade8f87279693722da031abd27b5c05ab979ca074a"} Jan 30 08:47:00 crc kubenswrapper[4870]: I0130 08:47:00.509603 4870 generic.go:334] "Generic (PLEG): container finished" podID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerID="d0054c95b532e36424c3acf46688e0ebfdf292697edc56afd3f30e979d09653c" exitCode=0 Jan 30 08:47:00 crc kubenswrapper[4870]: I0130 08:47:00.509670 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trmvn" event={"ID":"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a","Type":"ContainerDied","Data":"d0054c95b532e36424c3acf46688e0ebfdf292697edc56afd3f30e979d09653c"} Jan 30 08:47:01 crc kubenswrapper[4870]: I0130 08:47:01.520766 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trmvn" 
event={"ID":"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a","Type":"ContainerStarted","Data":"d36182e2b07c4b05bbda48d653498be4113ed91928168f19497faa1185fccd8f"} Jan 30 08:47:07 crc kubenswrapper[4870]: I0130 08:47:07.075394 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:47:07 crc kubenswrapper[4870]: E0130 08:47:07.076271 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:47:07 crc kubenswrapper[4870]: I0130 08:47:07.427935 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:47:07 crc kubenswrapper[4870]: I0130 08:47:07.428066 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:47:07 crc kubenswrapper[4870]: I0130 08:47:07.479921 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:47:07 crc kubenswrapper[4870]: I0130 08:47:07.507343 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-trmvn" podStartSLOduration=8.043887033 podStartE2EDuration="10.507318908s" podCreationTimestamp="2026-01-30 08:46:57 +0000 UTC" firstStartedPulling="2026-01-30 08:46:58.490626436 +0000 UTC m=+2257.186173555" lastFinishedPulling="2026-01-30 08:47:00.954058321 +0000 UTC m=+2259.649605430" observedRunningTime="2026-01-30 08:47:01.54151249 +0000 UTC m=+2260.237059609" watchObservedRunningTime="2026-01-30 08:47:07.507318908 
+0000 UTC m=+2266.202866027" Jan 30 08:47:07 crc kubenswrapper[4870]: I0130 08:47:07.630588 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:47:09 crc kubenswrapper[4870]: I0130 08:47:09.892319 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-trmvn"] Jan 30 08:47:10 crc kubenswrapper[4870]: I0130 08:47:10.615278 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-trmvn" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="registry-server" containerID="cri-o://d36182e2b07c4b05bbda48d653498be4113ed91928168f19497faa1185fccd8f" gracePeriod=2 Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.627074 4870 generic.go:334] "Generic (PLEG): container finished" podID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerID="d36182e2b07c4b05bbda48d653498be4113ed91928168f19497faa1185fccd8f" exitCode=0 Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.627146 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trmvn" event={"ID":"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a","Type":"ContainerDied","Data":"d36182e2b07c4b05bbda48d653498be4113ed91928168f19497faa1185fccd8f"} Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.760220 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.862736 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-catalog-content\") pod \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.862838 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-utilities\") pod \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.862948 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtf45\" (UniqueName: \"kubernetes.io/projected/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-kube-api-access-wtf45\") pod \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\" (UID: \"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a\") " Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.863973 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-utilities" (OuterVolumeSpecName: "utilities") pod "6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" (UID: "6a9a4f69-8d1a-4d88-b2cf-97c8c661899a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.873137 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-kube-api-access-wtf45" (OuterVolumeSpecName: "kube-api-access-wtf45") pod "6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" (UID: "6a9a4f69-8d1a-4d88-b2cf-97c8c661899a"). InnerVolumeSpecName "kube-api-access-wtf45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.906802 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" (UID: "6a9a4f69-8d1a-4d88-b2cf-97c8c661899a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.965459 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.965504 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:47:11 crc kubenswrapper[4870]: I0130 08:47:11.965517 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtf45\" (UniqueName: \"kubernetes.io/projected/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a-kube-api-access-wtf45\") on node \"crc\" DevicePath \"\"" Jan 30 08:47:12 crc kubenswrapper[4870]: I0130 08:47:12.640245 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trmvn" event={"ID":"6a9a4f69-8d1a-4d88-b2cf-97c8c661899a","Type":"ContainerDied","Data":"d437a8dec9fb30f00a4e34ade8f87279693722da031abd27b5c05ab979ca074a"} Jan 30 08:47:12 crc kubenswrapper[4870]: I0130 08:47:12.640589 4870 scope.go:117] "RemoveContainer" containerID="d36182e2b07c4b05bbda48d653498be4113ed91928168f19497faa1185fccd8f" Jan 30 08:47:12 crc kubenswrapper[4870]: I0130 08:47:12.640379 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-trmvn" Jan 30 08:47:12 crc kubenswrapper[4870]: I0130 08:47:12.666452 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-trmvn"] Jan 30 08:47:12 crc kubenswrapper[4870]: I0130 08:47:12.667309 4870 scope.go:117] "RemoveContainer" containerID="d0054c95b532e36424c3acf46688e0ebfdf292697edc56afd3f30e979d09653c" Jan 30 08:47:12 crc kubenswrapper[4870]: I0130 08:47:12.676192 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-trmvn"] Jan 30 08:47:12 crc kubenswrapper[4870]: I0130 08:47:12.692554 4870 scope.go:117] "RemoveContainer" containerID="fb63265b6df908944c072b8b056481ba1574a9802aca45041ac50a3b50cbe3ff" Jan 30 08:47:14 crc kubenswrapper[4870]: I0130 08:47:14.096147 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" path="/var/lib/kubelet/pods/6a9a4f69-8d1a-4d88-b2cf-97c8c661899a/volumes" Jan 30 08:47:20 crc kubenswrapper[4870]: I0130 08:47:20.074725 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:47:20 crc kubenswrapper[4870]: E0130 08:47:20.075752 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:47:32 crc kubenswrapper[4870]: I0130 08:47:32.083022 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:47:32 crc kubenswrapper[4870]: E0130 08:47:32.086531 4870 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:47:43 crc kubenswrapper[4870]: I0130 08:47:43.074610 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:47:43 crc kubenswrapper[4870]: E0130 08:47:43.075700 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:47:58 crc kubenswrapper[4870]: I0130 08:47:58.074638 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:47:58 crc kubenswrapper[4870]: E0130 08:47:58.075419 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:48:09 crc kubenswrapper[4870]: I0130 08:48:09.074602 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:48:09 crc kubenswrapper[4870]: E0130 08:48:09.075468 4870 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:48:21 crc kubenswrapper[4870]: I0130 08:48:21.075297 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:48:21 crc kubenswrapper[4870]: E0130 08:48:21.076088 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:48:32 crc kubenswrapper[4870]: I0130 08:48:32.082866 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:48:32 crc kubenswrapper[4870]: E0130 08:48:32.084303 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:48:44 crc kubenswrapper[4870]: I0130 08:48:44.076020 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:48:44 crc kubenswrapper[4870]: E0130 08:48:44.077603 4870 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:48:57 crc kubenswrapper[4870]: I0130 08:48:57.076171 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:48:57 crc kubenswrapper[4870]: E0130 08:48:57.077112 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:49:09 crc kubenswrapper[4870]: I0130 08:49:09.074568 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:49:09 crc kubenswrapper[4870]: E0130 08:49:09.076853 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:49:21 crc kubenswrapper[4870]: I0130 08:49:21.074768 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:49:21 crc kubenswrapper[4870]: E0130 
08:49:21.075936 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:49:35 crc kubenswrapper[4870]: I0130 08:49:35.075378 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:49:35 crc kubenswrapper[4870]: E0130 08:49:35.080375 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:49:49 crc kubenswrapper[4870]: I0130 08:49:49.074396 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:49:49 crc kubenswrapper[4870]: E0130 08:49:49.075287 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:50:03 crc kubenswrapper[4870]: I0130 08:50:03.075092 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:50:03 crc 
kubenswrapper[4870]: E0130 08:50:03.075852 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:50:18 crc kubenswrapper[4870]: I0130 08:50:18.075171 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:50:18 crc kubenswrapper[4870]: E0130 08:50:18.076045 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:50:22 crc kubenswrapper[4870]: I0130 08:50:22.451354 4870 generic.go:334] "Generic (PLEG): container finished" podID="9e214e41-a575-467c-a053-d6807c4f1512" containerID="849cf3c1bde9a8f54182673096d806dd29610e89f52d8b6fde1f213230d3c284" exitCode=0 Jan 30 08:50:22 crc kubenswrapper[4870]: I0130 08:50:22.451899 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr" event={"ID":"9e214e41-a575-467c-a053-d6807c4f1512","Type":"ContainerDied","Data":"849cf3c1bde9a8f54182673096d806dd29610e89f52d8b6fde1f213230d3c284"} Jan 30 08:50:23 crc kubenswrapper[4870]: I0130 08:50:23.892540 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.014869 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqzbr\" (UniqueName: \"kubernetes.io/projected/9e214e41-a575-467c-a053-d6807c4f1512-kube-api-access-sqzbr\") pod \"9e214e41-a575-467c-a053-d6807c4f1512\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.014989 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-inventory\") pod \"9e214e41-a575-467c-a053-d6807c4f1512\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.015068 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-secret-0\") pod \"9e214e41-a575-467c-a053-d6807c4f1512\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.015228 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-ssh-key-openstack-edpm-ipam\") pod \"9e214e41-a575-467c-a053-d6807c4f1512\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.015324 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-combined-ca-bundle\") pod \"9e214e41-a575-467c-a053-d6807c4f1512\" (UID: \"9e214e41-a575-467c-a053-d6807c4f1512\") " Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.020385 4870 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9e214e41-a575-467c-a053-d6807c4f1512" (UID: "9e214e41-a575-467c-a053-d6807c4f1512"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.020440 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e214e41-a575-467c-a053-d6807c4f1512-kube-api-access-sqzbr" (OuterVolumeSpecName: "kube-api-access-sqzbr") pod "9e214e41-a575-467c-a053-d6807c4f1512" (UID: "9e214e41-a575-467c-a053-d6807c4f1512"). InnerVolumeSpecName "kube-api-access-sqzbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.043191 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9e214e41-a575-467c-a053-d6807c4f1512" (UID: "9e214e41-a575-467c-a053-d6807c4f1512"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.044866 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-inventory" (OuterVolumeSpecName: "inventory") pod "9e214e41-a575-467c-a053-d6807c4f1512" (UID: "9e214e41-a575-467c-a053-d6807c4f1512"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.055118 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9e214e41-a575-467c-a053-d6807c4f1512" (UID: "9e214e41-a575-467c-a053-d6807c4f1512"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.117661 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.117996 4870 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.118006 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqzbr\" (UniqueName: \"kubernetes.io/projected/9e214e41-a575-467c-a053-d6807c4f1512-kube-api-access-sqzbr\") on node \"crc\" DevicePath \"\"" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.118018 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.118028 4870 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9e214e41-a575-467c-a053-d6807c4f1512-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.468246 4870 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr" event={"ID":"9e214e41-a575-467c-a053-d6807c4f1512","Type":"ContainerDied","Data":"958101a8528cf0e341326792f2f6ae6259384318c3c17b73fec5224f1c834d1c"} Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.468283 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="958101a8528cf0e341326792f2f6ae6259384318c3c17b73fec5224f1c834d1c" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.468304 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-26lfr" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.573496 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb"] Jan 30 08:50:24 crc kubenswrapper[4870]: E0130 08:50:24.573949 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="registry-server" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.573965 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="registry-server" Jan 30 08:50:24 crc kubenswrapper[4870]: E0130 08:50:24.573986 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="extract-utilities" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.573993 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="extract-utilities" Jan 30 08:50:24 crc kubenswrapper[4870]: E0130 08:50:24.574009 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e214e41-a575-467c-a053-d6807c4f1512" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.574016 4870 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9e214e41-a575-467c-a053-d6807c4f1512" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 08:50:24 crc kubenswrapper[4870]: E0130 08:50:24.574033 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="extract-content" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.574039 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="extract-content" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.574207 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e214e41-a575-467c-a053-d6807c4f1512" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.574226 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9a4f69-8d1a-4d88-b2cf-97c8c661899a" containerName="registry-server" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.574899 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.576820 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.576949 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.577351 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.577555 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.577928 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.583249 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb"] Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.585309 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.585953 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729036 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: 
I0130 08:50:24.729095 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729267 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729377 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729491 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26f7t\" (UniqueName: \"kubernetes.io/projected/da926ccc-5787-4741-a00c-1163494adb5e-kube-api-access-26f7t\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729535 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729674 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729705 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.729732 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da926ccc-5787-4741-a00c-1163494adb5e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.831676 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: 
\"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.831746 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.831823 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.831868 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.831936 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26f7t\" (UniqueName: \"kubernetes.io/projected/da926ccc-5787-4741-a00c-1163494adb5e-kube-api-access-26f7t\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.831969 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.832023 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.832055 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.832085 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da926ccc-5787-4741-a00c-1163494adb5e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.833166 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da926ccc-5787-4741-a00c-1163494adb5e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: 
\"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.838044 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.840592 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.841383 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.847937 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.847984 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.847951 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.848573 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.864731 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26f7t\" (UniqueName: \"kubernetes.io/projected/da926ccc-5787-4741-a00c-1163494adb5e-kube-api-access-26f7t\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4b7pb\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:24 crc kubenswrapper[4870]: I0130 08:50:24.896534 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:50:25 crc kubenswrapper[4870]: I0130 08:50:25.450607 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb"] Jan 30 08:50:25 crc kubenswrapper[4870]: I0130 08:50:25.477606 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" event={"ID":"da926ccc-5787-4741-a00c-1163494adb5e","Type":"ContainerStarted","Data":"bba6c00493ab2f4ae10ae8d8416c4437552d46fb802ef359a38e7c883b259426"} Jan 30 08:50:26 crc kubenswrapper[4870]: I0130 08:50:26.485841 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" event={"ID":"da926ccc-5787-4741-a00c-1163494adb5e","Type":"ContainerStarted","Data":"ce3daff1bc1672ba26e707a19c7baa5445c66c281c7820bc811779b2c7d174cf"} Jan 30 08:50:26 crc kubenswrapper[4870]: I0130 08:50:26.506993 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" podStartSLOduration=1.9629164399999999 podStartE2EDuration="2.506975423s" podCreationTimestamp="2026-01-30 08:50:24 +0000 UTC" firstStartedPulling="2026-01-30 08:50:25.45000204 +0000 UTC m=+2464.145549159" lastFinishedPulling="2026-01-30 08:50:25.994061033 +0000 UTC m=+2464.689608142" observedRunningTime="2026-01-30 08:50:26.499430268 +0000 UTC m=+2465.194977407" watchObservedRunningTime="2026-01-30 08:50:26.506975423 +0000 UTC m=+2465.202522532" Jan 30 08:50:33 crc kubenswrapper[4870]: I0130 08:50:33.075014 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:50:33 crc kubenswrapper[4870]: E0130 08:50:33.075919 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:50:46 crc kubenswrapper[4870]: I0130 08:50:46.074955 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:50:46 crc kubenswrapper[4870]: E0130 08:50:46.075626 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.313795 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-trcmx"] Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.316939 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.324756 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trcmx"] Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.511055 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-utilities\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.511753 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc59p\" (UniqueName: \"kubernetes.io/projected/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-kube-api-access-vc59p\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.511830 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-catalog-content\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.613535 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-catalog-content\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.613687 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-utilities\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.613819 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc59p\" (UniqueName: \"kubernetes.io/projected/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-kube-api-access-vc59p\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.614109 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-catalog-content\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.614187 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-utilities\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.645164 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc59p\" (UniqueName: \"kubernetes.io/projected/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-kube-api-access-vc59p\") pod \"redhat-marketplace-trcmx\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:49 crc kubenswrapper[4870]: I0130 08:50:49.945063 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:50 crc kubenswrapper[4870]: I0130 08:50:50.421832 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trcmx"] Jan 30 08:50:50 crc kubenswrapper[4870]: I0130 08:50:50.730295 4870 generic.go:334] "Generic (PLEG): container finished" podID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerID="8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270" exitCode=0 Jan 30 08:50:50 crc kubenswrapper[4870]: I0130 08:50:50.730344 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trcmx" event={"ID":"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec","Type":"ContainerDied","Data":"8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270"} Jan 30 08:50:50 crc kubenswrapper[4870]: I0130 08:50:50.730619 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trcmx" event={"ID":"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec","Type":"ContainerStarted","Data":"8a75d01ae5384c4b2a2bea2de803d33bc1fef67f0fed796468b959dc601d5043"} Jan 30 08:50:52 crc kubenswrapper[4870]: I0130 08:50:52.750379 4870 generic.go:334] "Generic (PLEG): container finished" podID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerID="9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8" exitCode=0 Jan 30 08:50:52 crc kubenswrapper[4870]: I0130 08:50:52.750480 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trcmx" event={"ID":"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec","Type":"ContainerDied","Data":"9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8"} Jan 30 08:50:54 crc kubenswrapper[4870]: I0130 08:50:54.839336 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trcmx" 
event={"ID":"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec","Type":"ContainerStarted","Data":"91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde"} Jan 30 08:50:54 crc kubenswrapper[4870]: I0130 08:50:54.878597 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-trcmx" podStartSLOduration=3.110731107 podStartE2EDuration="5.87857718s" podCreationTimestamp="2026-01-30 08:50:49 +0000 UTC" firstStartedPulling="2026-01-30 08:50:50.731943182 +0000 UTC m=+2489.427490291" lastFinishedPulling="2026-01-30 08:50:53.499789235 +0000 UTC m=+2492.195336364" observedRunningTime="2026-01-30 08:50:54.875228016 +0000 UTC m=+2493.570775135" watchObservedRunningTime="2026-01-30 08:50:54.87857718 +0000 UTC m=+2493.574124289" Jan 30 08:50:59 crc kubenswrapper[4870]: I0130 08:50:59.945988 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:59 crc kubenswrapper[4870]: I0130 08:50:59.946574 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:50:59 crc kubenswrapper[4870]: I0130 08:50:59.998505 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:51:00 crc kubenswrapper[4870]: I0130 08:51:00.937635 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:51:00 crc kubenswrapper[4870]: I0130 08:51:00.982894 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trcmx"] Jan 30 08:51:01 crc kubenswrapper[4870]: I0130 08:51:01.075557 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:51:01 crc kubenswrapper[4870]: E0130 08:51:01.075904 4870 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:51:02 crc kubenswrapper[4870]: I0130 08:51:02.908486 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-trcmx" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="registry-server" containerID="cri-o://91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde" gracePeriod=2 Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.373887 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.544430 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc59p\" (UniqueName: \"kubernetes.io/projected/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-kube-api-access-vc59p\") pod \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.544732 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-utilities\") pod \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\" (UID: \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.544992 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-catalog-content\") pod \"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\" (UID: 
\"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec\") " Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.545923 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-utilities" (OuterVolumeSpecName: "utilities") pod "87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" (UID: "87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.550779 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.561589 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-kube-api-access-vc59p" (OuterVolumeSpecName: "kube-api-access-vc59p") pod "87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" (UID: "87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec"). InnerVolumeSpecName "kube-api-access-vc59p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.585517 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" (UID: "87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.654071 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc59p\" (UniqueName: \"kubernetes.io/projected/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-kube-api-access-vc59p\") on node \"crc\" DevicePath \"\"" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.654156 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.920363 4870 generic.go:334] "Generic (PLEG): container finished" podID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerID="91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde" exitCode=0 Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.920416 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trcmx" event={"ID":"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec","Type":"ContainerDied","Data":"91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde"} Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.920453 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trcmx" event={"ID":"87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec","Type":"ContainerDied","Data":"8a75d01ae5384c4b2a2bea2de803d33bc1fef67f0fed796468b959dc601d5043"} Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.920473 4870 scope.go:117] "RemoveContainer" containerID="91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.920428 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trcmx" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.947444 4870 scope.go:117] "RemoveContainer" containerID="9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8" Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.957372 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trcmx"] Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.965729 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-trcmx"] Jan 30 08:51:03 crc kubenswrapper[4870]: I0130 08:51:03.982499 4870 scope.go:117] "RemoveContainer" containerID="8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270" Jan 30 08:51:04 crc kubenswrapper[4870]: I0130 08:51:04.022451 4870 scope.go:117] "RemoveContainer" containerID="91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde" Jan 30 08:51:04 crc kubenswrapper[4870]: E0130 08:51:04.022985 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde\": container with ID starting with 91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde not found: ID does not exist" containerID="91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde" Jan 30 08:51:04 crc kubenswrapper[4870]: I0130 08:51:04.023028 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde"} err="failed to get container status \"91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde\": rpc error: code = NotFound desc = could not find container \"91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde\": container with ID starting with 91879a3d12cdb103682c88b869a7a8eafe18c4b3b76fbf459fb48f2a8c17fcde not found: 
ID does not exist" Jan 30 08:51:04 crc kubenswrapper[4870]: I0130 08:51:04.023083 4870 scope.go:117] "RemoveContainer" containerID="9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8" Jan 30 08:51:04 crc kubenswrapper[4870]: E0130 08:51:04.023543 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8\": container with ID starting with 9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8 not found: ID does not exist" containerID="9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8" Jan 30 08:51:04 crc kubenswrapper[4870]: I0130 08:51:04.023594 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8"} err="failed to get container status \"9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8\": rpc error: code = NotFound desc = could not find container \"9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8\": container with ID starting with 9c90272b72bb2b233b86a8b1d5e7a016e253e833dc283bd605c0711e8cd1d6c8 not found: ID does not exist" Jan 30 08:51:04 crc kubenswrapper[4870]: I0130 08:51:04.023612 4870 scope.go:117] "RemoveContainer" containerID="8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270" Jan 30 08:51:04 crc kubenswrapper[4870]: E0130 08:51:04.024148 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270\": container with ID starting with 8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270 not found: ID does not exist" containerID="8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270" Jan 30 08:51:04 crc kubenswrapper[4870]: I0130 08:51:04.024183 4870 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270"} err="failed to get container status \"8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270\": rpc error: code = NotFound desc = could not find container \"8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270\": container with ID starting with 8a0125be7ea3ce2393a8d7b81dc51bb9070b89ed613cec2a0dd8fdb27d330270 not found: ID does not exist" Jan 30 08:51:04 crc kubenswrapper[4870]: I0130 08:51:04.087063 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" path="/var/lib/kubelet/pods/87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec/volumes" Jan 30 08:51:14 crc kubenswrapper[4870]: I0130 08:51:14.075186 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:51:14 crc kubenswrapper[4870]: E0130 08:51:14.076310 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:51:25 crc kubenswrapper[4870]: I0130 08:51:25.076242 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:51:25 crc kubenswrapper[4870]: E0130 08:51:25.076988 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:51:40 crc kubenswrapper[4870]: I0130 08:51:40.074464 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:51:40 crc kubenswrapper[4870]: E0130 08:51:40.075470 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:51:51 crc kubenswrapper[4870]: I0130 08:51:51.075364 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:51:51 crc kubenswrapper[4870]: E0130 08:51:51.076085 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 08:52:03 crc kubenswrapper[4870]: I0130 08:52:03.075997 4870 scope.go:117] "RemoveContainer" containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:52:03 crc kubenswrapper[4870]: I0130 08:52:03.431965 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"9db7902fb455898f2a67824f5d2bac1880accc6a6ce6fbe42d5af520837903b4"} Jan 30 08:52:44 crc 
kubenswrapper[4870]: I0130 08:52:44.860322 4870 generic.go:334] "Generic (PLEG): container finished" podID="da926ccc-5787-4741-a00c-1163494adb5e" containerID="ce3daff1bc1672ba26e707a19c7baa5445c66c281c7820bc811779b2c7d174cf" exitCode=0 Jan 30 08:52:44 crc kubenswrapper[4870]: I0130 08:52:44.860374 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" event={"ID":"da926ccc-5787-4741-a00c-1163494adb5e","Type":"ContainerDied","Data":"ce3daff1bc1672ba26e707a19c7baa5445c66c281c7820bc811779b2c7d174cf"} Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.400680 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.450171 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da926ccc-5787-4741-a00c-1163494adb5e-nova-extra-config-0\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.450476 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-0\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.450653 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-0\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.450965 4870 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-combined-ca-bundle\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.451520 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-1\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.451640 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-inventory\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.451771 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-ssh-key-openstack-edpm-ipam\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.451925 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-1\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.452040 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26f7t\" (UniqueName: 
\"kubernetes.io/projected/da926ccc-5787-4741-a00c-1163494adb5e-kube-api-access-26f7t\") pod \"da926ccc-5787-4741-a00c-1163494adb5e\" (UID: \"da926ccc-5787-4741-a00c-1163494adb5e\") " Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.457213 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da926ccc-5787-4741-a00c-1163494adb5e-kube-api-access-26f7t" (OuterVolumeSpecName: "kube-api-access-26f7t") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "kube-api-access-26f7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.457322 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.484335 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-inventory" (OuterVolumeSpecName: "inventory") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.486289 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.486512 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da926ccc-5787-4741-a00c-1163494adb5e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.486776 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.493205 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.497606 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.498545 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "da926ccc-5787-4741-a00c-1163494adb5e" (UID: "da926ccc-5787-4741-a00c-1163494adb5e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.554929 4870 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.554981 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.554991 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.555000 4870 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.555009 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26f7t\" (UniqueName: \"kubernetes.io/projected/da926ccc-5787-4741-a00c-1163494adb5e-kube-api-access-26f7t\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.555021 4870 
reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/da926ccc-5787-4741-a00c-1163494adb5e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.555030 4870 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.555038 4870 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.555051 4870 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da926ccc-5787-4741-a00c-1163494adb5e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.883031 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" event={"ID":"da926ccc-5787-4741-a00c-1163494adb5e","Type":"ContainerDied","Data":"bba6c00493ab2f4ae10ae8d8416c4437552d46fb802ef359a38e7c883b259426"} Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.883101 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba6c00493ab2f4ae10ae8d8416c4437552d46fb802ef359a38e7c883b259426" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.883051 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4b7pb" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.974173 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz"] Jan 30 08:52:46 crc kubenswrapper[4870]: E0130 08:52:46.974554 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="extract-utilities" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.974570 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="extract-utilities" Jan 30 08:52:46 crc kubenswrapper[4870]: E0130 08:52:46.974588 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da926ccc-5787-4741-a00c-1163494adb5e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.974595 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="da926ccc-5787-4741-a00c-1163494adb5e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 08:52:46 crc kubenswrapper[4870]: E0130 08:52:46.974621 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="extract-content" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.974627 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="extract-content" Jan 30 08:52:46 crc kubenswrapper[4870]: E0130 08:52:46.974637 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="registry-server" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.974643 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="registry-server" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.974812 4870 
memory_manager.go:354] "RemoveStaleState removing state" podUID="87f5cab1-c7ab-4b3f-9ae6-f050d22c06ec" containerName="registry-server" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.974832 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="da926ccc-5787-4741-a00c-1163494adb5e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.975439 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.985696 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz"] Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.986015 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.986175 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.986335 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 08:52:46 crc kubenswrapper[4870]: I0130 08:52:46.986754 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n6f5w" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.002483 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.064716 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.064829 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdzp2\" (UniqueName: \"kubernetes.io/projected/1e93cbad-07e7-4073-a577-b666a6901a1d-kube-api-access-gdzp2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.064907 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.065015 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.065156 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: 
\"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.065206 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.065284 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.167408 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.167500 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc 
kubenswrapper[4870]: I0130 08:52:47.167563 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.167638 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.167765 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdzp2\" (UniqueName: \"kubernetes.io/projected/1e93cbad-07e7-4073-a577-b666a6901a1d-kube-api-access-gdzp2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.167899 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.167949 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.174148 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.174422 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.174957 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.176118 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.181277 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.182015 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.193227 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdzp2\" (UniqueName: \"kubernetes.io/projected/1e93cbad-07e7-4073-a577-b666a6901a1d-kube-api-access-gdzp2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.312109 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.824158 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.826608 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz"] Jan 30 08:52:47 crc kubenswrapper[4870]: I0130 08:52:47.894662 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" event={"ID":"1e93cbad-07e7-4073-a577-b666a6901a1d","Type":"ContainerStarted","Data":"20ba77c8c428673fee3b11764831b8a484f339cb904a99f86474d502491153f0"} Jan 30 08:52:49 crc kubenswrapper[4870]: I0130 08:52:49.913513 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" event={"ID":"1e93cbad-07e7-4073-a577-b666a6901a1d","Type":"ContainerStarted","Data":"985b10315ccae1ef801aaf273960bc89bc06483a8a1faada7eb0a77adb779982"} Jan 30 08:52:49 crc kubenswrapper[4870]: I0130 08:52:49.933856 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" podStartSLOduration=2.95598236 podStartE2EDuration="3.933827783s" podCreationTimestamp="2026-01-30 08:52:46 +0000 UTC" firstStartedPulling="2026-01-30 08:52:47.823686273 +0000 UTC m=+2606.519233382" lastFinishedPulling="2026-01-30 08:52:48.801531686 +0000 UTC m=+2607.497078805" observedRunningTime="2026-01-30 08:52:49.930276102 +0000 UTC m=+2608.625823211" watchObservedRunningTime="2026-01-30 08:52:49.933827783 +0000 UTC m=+2608.629374892" Jan 30 08:54:25 crc kubenswrapper[4870]: I0130 08:54:25.249493 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:54:25 crc kubenswrapper[4870]: I0130 08:54:25.250023 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:54:51 crc kubenswrapper[4870]: I0130 08:54:51.075378 4870 generic.go:334] "Generic (PLEG): container finished" podID="1e93cbad-07e7-4073-a577-b666a6901a1d" containerID="985b10315ccae1ef801aaf273960bc89bc06483a8a1faada7eb0a77adb779982" exitCode=0 Jan 30 08:54:51 crc kubenswrapper[4870]: I0130 08:54:51.075459 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" event={"ID":"1e93cbad-07e7-4073-a577-b666a6901a1d","Type":"ContainerDied","Data":"985b10315ccae1ef801aaf273960bc89bc06483a8a1faada7eb0a77adb779982"} Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.608896 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.755832 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdzp2\" (UniqueName: \"kubernetes.io/projected/1e93cbad-07e7-4073-a577-b666a6901a1d-kube-api-access-gdzp2\") pod \"1e93cbad-07e7-4073-a577-b666a6901a1d\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.756110 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-telemetry-combined-ca-bundle\") pod \"1e93cbad-07e7-4073-a577-b666a6901a1d\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.756157 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ssh-key-openstack-edpm-ipam\") pod \"1e93cbad-07e7-4073-a577-b666a6901a1d\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.756237 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-2\") pod \"1e93cbad-07e7-4073-a577-b666a6901a1d\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.756265 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-inventory\") pod \"1e93cbad-07e7-4073-a577-b666a6901a1d\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 
08:54:52.756287 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-1\") pod \"1e93cbad-07e7-4073-a577-b666a6901a1d\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.756359 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-0\") pod \"1e93cbad-07e7-4073-a577-b666a6901a1d\" (UID: \"1e93cbad-07e7-4073-a577-b666a6901a1d\") " Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.765198 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1e93cbad-07e7-4073-a577-b666a6901a1d" (UID: "1e93cbad-07e7-4073-a577-b666a6901a1d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.765418 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e93cbad-07e7-4073-a577-b666a6901a1d-kube-api-access-gdzp2" (OuterVolumeSpecName: "kube-api-access-gdzp2") pod "1e93cbad-07e7-4073-a577-b666a6901a1d" (UID: "1e93cbad-07e7-4073-a577-b666a6901a1d"). InnerVolumeSpecName "kube-api-access-gdzp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.797318 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-inventory" (OuterVolumeSpecName: "inventory") pod "1e93cbad-07e7-4073-a577-b666a6901a1d" (UID: "1e93cbad-07e7-4073-a577-b666a6901a1d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.798008 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "1e93cbad-07e7-4073-a577-b666a6901a1d" (UID: "1e93cbad-07e7-4073-a577-b666a6901a1d"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.798811 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "1e93cbad-07e7-4073-a577-b666a6901a1d" (UID: "1e93cbad-07e7-4073-a577-b666a6901a1d"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.799107 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "1e93cbad-07e7-4073-a577-b666a6901a1d" (UID: "1e93cbad-07e7-4073-a577-b666a6901a1d"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.801111 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1e93cbad-07e7-4073-a577-b666a6901a1d" (UID: "1e93cbad-07e7-4073-a577-b666a6901a1d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.859000 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdzp2\" (UniqueName: \"kubernetes.io/projected/1e93cbad-07e7-4073-a577-b666a6901a1d-kube-api-access-gdzp2\") on node \"crc\" DevicePath \"\"" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.859050 4870 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.859066 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.859078 4870 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.859094 4870 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 
08:54:52.859107 4870 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 30 08:54:52 crc kubenswrapper[4870]: I0130 08:54:52.859119 4870 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1e93cbad-07e7-4073-a577-b666a6901a1d-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 30 08:54:53 crc kubenswrapper[4870]: I0130 08:54:53.127354 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" event={"ID":"1e93cbad-07e7-4073-a577-b666a6901a1d","Type":"ContainerDied","Data":"20ba77c8c428673fee3b11764831b8a484f339cb904a99f86474d502491153f0"} Jan 30 08:54:53 crc kubenswrapper[4870]: I0130 08:54:53.127421 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20ba77c8c428673fee3b11764831b8a484f339cb904a99f86474d502491153f0" Jan 30 08:54:53 crc kubenswrapper[4870]: I0130 08:54:53.127445 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz" Jan 30 08:54:55 crc kubenswrapper[4870]: I0130 08:54:55.249795 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:54:55 crc kubenswrapper[4870]: I0130 08:54:55.250226 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.806270 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttj46"] Jan 30 08:55:14 crc kubenswrapper[4870]: E0130 08:55:14.808201 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e93cbad-07e7-4073-a577-b666a6901a1d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.808219 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e93cbad-07e7-4073-a577-b666a6901a1d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.808657 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e93cbad-07e7-4073-a577-b666a6901a1d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.810153 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.832855 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttj46"] Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.905218 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-utilities\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.905811 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-catalog-content\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:14 crc kubenswrapper[4870]: I0130 08:55:14.905934 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqdrw\" (UniqueName: \"kubernetes.io/projected/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-kube-api-access-zqdrw\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.007674 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-utilities\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.007782 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-catalog-content\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.007909 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqdrw\" (UniqueName: \"kubernetes.io/projected/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-kube-api-access-zqdrw\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.008966 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-utilities\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.009260 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-catalog-content\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.032799 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqdrw\" (UniqueName: \"kubernetes.io/projected/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-kube-api-access-zqdrw\") pod \"redhat-operators-ttj46\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.184292 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:15 crc kubenswrapper[4870]: I0130 08:55:15.697836 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttj46"] Jan 30 08:55:16 crc kubenswrapper[4870]: I0130 08:55:16.362231 4870 generic.go:334] "Generic (PLEG): container finished" podID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerID="4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223" exitCode=0 Jan 30 08:55:16 crc kubenswrapper[4870]: I0130 08:55:16.362366 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttj46" event={"ID":"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8","Type":"ContainerDied","Data":"4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223"} Jan 30 08:55:16 crc kubenswrapper[4870]: I0130 08:55:16.362596 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttj46" event={"ID":"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8","Type":"ContainerStarted","Data":"edcb4436070891faee3a6385c93795c073a934cb14d6e9e2233f2ac428560b42"} Jan 30 08:55:18 crc kubenswrapper[4870]: I0130 08:55:18.426160 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttj46" event={"ID":"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8","Type":"ContainerStarted","Data":"c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55"} Jan 30 08:55:20 crc kubenswrapper[4870]: I0130 08:55:20.443924 4870 generic.go:334] "Generic (PLEG): container finished" podID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerID="c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55" exitCode=0 Jan 30 08:55:20 crc kubenswrapper[4870]: I0130 08:55:20.444189 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttj46" 
event={"ID":"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8","Type":"ContainerDied","Data":"c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55"} Jan 30 08:55:22 crc kubenswrapper[4870]: I0130 08:55:22.463989 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttj46" event={"ID":"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8","Type":"ContainerStarted","Data":"244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb"} Jan 30 08:55:22 crc kubenswrapper[4870]: I0130 08:55:22.495806 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttj46" podStartSLOduration=3.214222975 podStartE2EDuration="8.495788539s" podCreationTimestamp="2026-01-30 08:55:14 +0000 UTC" firstStartedPulling="2026-01-30 08:55:16.364409975 +0000 UTC m=+2755.059957084" lastFinishedPulling="2026-01-30 08:55:21.645975519 +0000 UTC m=+2760.341522648" observedRunningTime="2026-01-30 08:55:22.488536712 +0000 UTC m=+2761.184083821" watchObservedRunningTime="2026-01-30 08:55:22.495788539 +0000 UTC m=+2761.191335648" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.185624 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.186169 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.249997 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.250054 4870 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.250104 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.250901 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9db7902fb455898f2a67824f5d2bac1880accc6a6ce6fbe42d5af520837903b4"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.250966 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://9db7902fb455898f2a67824f5d2bac1880accc6a6ce6fbe42d5af520837903b4" gracePeriod=600 Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.491764 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="9db7902fb455898f2a67824f5d2bac1880accc6a6ce6fbe42d5af520837903b4" exitCode=0 Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.491817 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"9db7902fb455898f2a67824f5d2bac1880accc6a6ce6fbe42d5af520837903b4"} Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.491867 4870 scope.go:117] "RemoveContainer" 
containerID="c5e496759a03317cce950bd3f63bcb5ceea02cbee0d1be311b1868af5eaedaac" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.853074 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.857919 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.861188 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.869907 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917118 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917185 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-config-data\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917624 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917686 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-run\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917730 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917809 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917859 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-scripts\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.917929 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2g6\" (UniqueName: \"kubernetes.io/projected/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-kube-api-access-mt2g6\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.918001 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-dev\") 
pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.918052 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-sys\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.918070 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.918087 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.918196 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.918286 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 
08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.918425 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-lib-modules\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.966476 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.968427 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.970624 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Jan 30 08:55:25 crc kubenswrapper[4870]: I0130 08:55:25.993078 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.002980 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.004750 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.006621 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.021989 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-lib-modules\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022061 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022095 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-config-data\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022124 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022147 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-run\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " 
pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022171 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022214 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022246 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-scripts\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022276 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2g6\" (UniqueName: \"kubernetes.io/projected/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-kube-api-access-mt2g6\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022310 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-dev\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022334 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-sys\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022353 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022382 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022428 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022490 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022610 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc 
kubenswrapper[4870]: I0130 08:55:26.022653 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-lib-modules\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.022915 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.024559 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.024615 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-run\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.024818 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.024894 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-dev\") pod \"cinder-backup-0\" (UID: 
\"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.024923 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-sys\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.024948 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.025654 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.031103 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.037537 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.041994 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Jan 30 
08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.044939 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-scripts\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.057873 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2g6\" (UniqueName: \"kubernetes.io/projected/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-kube-api-access-mt2g6\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.058659 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cecf4070-2dd9-496d-bf4d-7f456eb6ed72-config-data\") pod \"cinder-backup-0\" (UID: \"cecf4070-2dd9-496d-bf4d-7f456eb6ed72\") " pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125320 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t92n5\" (UniqueName: \"kubernetes.io/projected/56215e10-017e-4662-92ab-8f25178c0fab-kube-api-access-t92n5\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125377 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125407 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dev\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-dev\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125424 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125444 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-run\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125468 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125487 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125508 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125547 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125566 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125590 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125605 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125620 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-lib-modules\") pod 
\"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125635 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125680 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125715 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125745 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125770 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-dev\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125792 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125812 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125835 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125860 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125937 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " 
pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.125976 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.126007 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.126043 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-sys\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.126253 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.126332 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.126409 
4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.126547 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjbs\" (UniqueName: \"kubernetes.io/projected/06465a52-3f34-45fd-b95e-e679adcb59e6-kube-api-access-lrjbs\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.217980 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228357 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjbs\" (UniqueName: \"kubernetes.io/projected/06465a52-3f34-45fd-b95e-e679adcb59e6-kube-api-access-lrjbs\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228430 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t92n5\" (UniqueName: \"kubernetes.io/projected/56215e10-017e-4662-92ab-8f25178c0fab-kube-api-access-t92n5\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228450 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: 
\"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228558 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228690 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-dev\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228468 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-dev\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228825 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228850 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-run\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228889 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228909 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228931 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228967 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.228986 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229009 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-config-data\") pod \"cinder-volume-nfs-0\" (UID: 
\"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229024 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229039 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229055 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229093 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229117 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229139 4870 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229154 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229186 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229202 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229220 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229239 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229266 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229289 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229312 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229333 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-sys\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229361 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " 
pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229376 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229404 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.229436 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230249 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-run\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230306 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230360 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230392 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230530 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230529 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230612 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230648 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-sys\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 
crc kubenswrapper[4870]: I0130 08:55:26.230684 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230710 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230743 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.230791 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.231364 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.231432 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.231466 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.231504 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/06465a52-3f34-45fd-b95e-e679adcb59e6-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.231911 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56215e10-017e-4662-92ab-8f25178c0fab-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.234163 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttj46" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" probeResult="failure" output=< Jan 30 08:55:26 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:55:26 crc kubenswrapper[4870]: > Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.234598 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " 
pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.235022 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.236000 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.236862 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.239432 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.239839 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.242628 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/56215e10-017e-4662-92ab-8f25178c0fab-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.248703 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06465a52-3f34-45fd-b95e-e679adcb59e6-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.250828 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjbs\" (UniqueName: \"kubernetes.io/projected/06465a52-3f34-45fd-b95e-e679adcb59e6-kube-api-access-lrjbs\") pod \"cinder-volume-nfs-0\" (UID: \"06465a52-3f34-45fd-b95e-e679adcb59e6\") " pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.251039 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t92n5\" (UniqueName: \"kubernetes.io/projected/56215e10-017e-4662-92ab-8f25178c0fab-kube-api-access-t92n5\") pod \"cinder-volume-nfs-2-0\" (UID: \"56215e10-017e-4662-92ab-8f25178c0fab\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.285808 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.439195 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.553317 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"} Jan 30 08:55:26 crc kubenswrapper[4870]: I0130 08:55:26.913167 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 30 08:55:27 crc kubenswrapper[4870]: I0130 08:55:27.132744 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Jan 30 08:55:27 crc kubenswrapper[4870]: I0130 08:55:27.253172 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Jan 30 08:55:27 crc kubenswrapper[4870]: W0130 08:55:27.445938 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06465a52_3f34_45fd_b95e_e679adcb59e6.slice/crio-d7dfbf5cd33ed1cccec4aaeb8099c65062912abd65c6c1a41e5269770529fbef WatchSource:0}: Error finding container d7dfbf5cd33ed1cccec4aaeb8099c65062912abd65c6c1a41e5269770529fbef: Status 404 returned error can't find the container with id d7dfbf5cd33ed1cccec4aaeb8099c65062912abd65c6c1a41e5269770529fbef Jan 30 08:55:27 crc kubenswrapper[4870]: I0130 08:55:27.587017 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"56215e10-017e-4662-92ab-8f25178c0fab","Type":"ContainerStarted","Data":"d6ff31af0b9c693f7878c9396efb9be4b00263c5fd248dfa95cc3b26021e7e19"} Jan 30 08:55:27 crc kubenswrapper[4870]: I0130 08:55:27.618577 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"cecf4070-2dd9-496d-bf4d-7f456eb6ed72","Type":"ContainerStarted","Data":"23e8b8de0d67143b1cf5603519e372944b671a9479a54617f6f64c87ac458e6d"} Jan 30 08:55:27 crc kubenswrapper[4870]: I0130 08:55:27.636976 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"06465a52-3f34-45fd-b95e-e679adcb59e6","Type":"ContainerStarted","Data":"d7dfbf5cd33ed1cccec4aaeb8099c65062912abd65c6c1a41e5269770529fbef"} Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.648163 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"06465a52-3f34-45fd-b95e-e679adcb59e6","Type":"ContainerStarted","Data":"858df5bdce00185c5e4fe9ee5ded6e1b854d84ac8baf1197016c54a12d797c4a"} Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.648742 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"06465a52-3f34-45fd-b95e-e679adcb59e6","Type":"ContainerStarted","Data":"16c32da4b856d2522ef7f18800a180372b7266486fc968f6e1635bbd59831b3f"} Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.657652 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"56215e10-017e-4662-92ab-8f25178c0fab","Type":"ContainerStarted","Data":"adaa71de8c46c4ed98102bf2d93148478c3797029dd38cab3a7e607f27abee56"} Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.657698 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"56215e10-017e-4662-92ab-8f25178c0fab","Type":"ContainerStarted","Data":"1a426c582aed3f1f227d92cdaccf150a841885fc6541669ad6d0a2e1ea570008"} Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.661976 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cecf4070-2dd9-496d-bf4d-7f456eb6ed72","Type":"ContainerStarted","Data":"21d9690d3844574cbfcb154865410a91fa68f0a2b1537f5109cd02161959d703"} Jan 30 
08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.662564 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cecf4070-2dd9-496d-bf4d-7f456eb6ed72","Type":"ContainerStarted","Data":"78c12f1ff4d4aa25d08587cb6d42b529c6ac38ece09f08fd2bf4b617cb236fe6"} Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.674295 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=3.520329712 podStartE2EDuration="3.674279214s" podCreationTimestamp="2026-01-30 08:55:25 +0000 UTC" firstStartedPulling="2026-01-30 08:55:27.456649319 +0000 UTC m=+2766.152196418" lastFinishedPulling="2026-01-30 08:55:27.610598821 +0000 UTC m=+2766.306145920" observedRunningTime="2026-01-30 08:55:28.672510829 +0000 UTC m=+2767.368057928" watchObservedRunningTime="2026-01-30 08:55:28.674279214 +0000 UTC m=+2767.369826323" Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.711898 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=3.344351473 podStartE2EDuration="3.711850918s" podCreationTimestamp="2026-01-30 08:55:25 +0000 UTC" firstStartedPulling="2026-01-30 08:55:27.15525255 +0000 UTC m=+2765.850799659" lastFinishedPulling="2026-01-30 08:55:27.522751985 +0000 UTC m=+2766.218299104" observedRunningTime="2026-01-30 08:55:28.703244739 +0000 UTC m=+2767.398791858" watchObservedRunningTime="2026-01-30 08:55:28.711850918 +0000 UTC m=+2767.407398027" Jan 30 08:55:28 crc kubenswrapper[4870]: I0130 08:55:28.731921 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.496258669 podStartE2EDuration="3.731903834s" podCreationTimestamp="2026-01-30 08:55:25 +0000 UTC" firstStartedPulling="2026-01-30 08:55:26.919682858 +0000 UTC m=+2765.615229967" lastFinishedPulling="2026-01-30 08:55:27.155328013 +0000 UTC m=+2765.850875132" 
observedRunningTime="2026-01-30 08:55:28.728167028 +0000 UTC m=+2767.423714137" watchObservedRunningTime="2026-01-30 08:55:28.731903834 +0000 UTC m=+2767.427450943" Jan 30 08:55:31 crc kubenswrapper[4870]: I0130 08:55:31.218156 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 30 08:55:31 crc kubenswrapper[4870]: I0130 08:55:31.287091 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:31 crc kubenswrapper[4870]: I0130 08:55:31.439597 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:36 crc kubenswrapper[4870]: I0130 08:55:36.262170 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttj46" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" probeResult="failure" output=< Jan 30 08:55:36 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:55:36 crc kubenswrapper[4870]: > Jan 30 08:55:36 crc kubenswrapper[4870]: I0130 08:55:36.464137 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 30 08:55:37 crc kubenswrapper[4870]: I0130 08:55:37.040285 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Jan 30 08:55:37 crc kubenswrapper[4870]: I0130 08:55:37.130048 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Jan 30 08:55:46 crc kubenswrapper[4870]: I0130 08:55:46.236222 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttj46" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" probeResult="failure" output=< Jan 30 08:55:46 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 
30 08:55:46 crc kubenswrapper[4870]: > Jan 30 08:55:56 crc kubenswrapper[4870]: I0130 08:55:56.232387 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttj46" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" probeResult="failure" output=< Jan 30 08:55:56 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 08:55:56 crc kubenswrapper[4870]: > Jan 30 08:56:05 crc kubenswrapper[4870]: I0130 08:56:05.236801 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:56:05 crc kubenswrapper[4870]: I0130 08:56:05.288070 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:56:05 crc kubenswrapper[4870]: I0130 08:56:05.475140 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttj46"] Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.059225 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ttj46" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" containerID="cri-o://244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb" gracePeriod=2 Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.600682 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.786977 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-utilities\") pod \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.787448 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-catalog-content\") pod \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.787570 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqdrw\" (UniqueName: \"kubernetes.io/projected/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-kube-api-access-zqdrw\") pod \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\" (UID: \"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8\") " Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.789203 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-utilities" (OuterVolumeSpecName: "utilities") pod "b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" (UID: "b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.794925 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-kube-api-access-zqdrw" (OuterVolumeSpecName: "kube-api-access-zqdrw") pod "b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" (UID: "b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8"). InnerVolumeSpecName "kube-api-access-zqdrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.890756 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqdrw\" (UniqueName: \"kubernetes.io/projected/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-kube-api-access-zqdrw\") on node \"crc\" DevicePath \"\"" Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.890824 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.903399 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" (UID: "b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:56:07 crc kubenswrapper[4870]: I0130 08:56:07.991822 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.070980 4870 generic.go:334] "Generic (PLEG): container finished" podID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerID="244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb" exitCode=0 Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.071030 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttj46" event={"ID":"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8","Type":"ContainerDied","Data":"244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb"} Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.071063 4870 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-ttj46" event={"ID":"b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8","Type":"ContainerDied","Data":"edcb4436070891faee3a6385c93795c073a934cb14d6e9e2233f2ac428560b42"} Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.071082 4870 scope.go:117] "RemoveContainer" containerID="244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.071961 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttj46" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.124782 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttj46"] Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.131775 4870 scope.go:117] "RemoveContainer" containerID="c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.134644 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ttj46"] Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.154094 4870 scope.go:117] "RemoveContainer" containerID="4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.194285 4870 scope.go:117] "RemoveContainer" containerID="244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb" Jan 30 08:56:08 crc kubenswrapper[4870]: E0130 08:56:08.194695 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb\": container with ID starting with 244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb not found: ID does not exist" containerID="244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.194732 4870 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb"} err="failed to get container status \"244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb\": rpc error: code = NotFound desc = could not find container \"244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb\": container with ID starting with 244c17a2d3dadb1346e7260f17f9015369b980c0ff0706d749ce8a104161debb not found: ID does not exist" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.194755 4870 scope.go:117] "RemoveContainer" containerID="c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55" Jan 30 08:56:08 crc kubenswrapper[4870]: E0130 08:56:08.195216 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55\": container with ID starting with c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55 not found: ID does not exist" containerID="c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.195246 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55"} err="failed to get container status \"c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55\": rpc error: code = NotFound desc = could not find container \"c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55\": container with ID starting with c750e14ec0d483905810cdf2c55155d8e55a2cd9015dc13d53c80e893c497f55 not found: ID does not exist" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.195262 4870 scope.go:117] "RemoveContainer" containerID="4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223" Jan 30 08:56:08 crc kubenswrapper[4870]: E0130 
08:56:08.196198 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223\": container with ID starting with 4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223 not found: ID does not exist" containerID="4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223" Jan 30 08:56:08 crc kubenswrapper[4870]: I0130 08:56:08.196340 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223"} err="failed to get container status \"4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223\": rpc error: code = NotFound desc = could not find container \"4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223\": container with ID starting with 4b0288aa2f1e5ed2aa5e341896bd0a25bf0ee307d0d5a1b59eeeb261aa26f223 not found: ID does not exist" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.890259 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6jj6s"] Jan 30 08:56:09 crc kubenswrapper[4870]: E0130 08:56:09.891319 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="extract-content" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.891344 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="extract-content" Jan 30 08:56:09 crc kubenswrapper[4870]: E0130 08:56:09.891387 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.891400 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" Jan 30 08:56:09 crc 
kubenswrapper[4870]: E0130 08:56:09.891431 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="extract-utilities" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.891443 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="extract-utilities" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.891780 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" containerName="registry-server" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.894979 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.903556 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6jj6s"] Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.932568 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-catalog-content\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.932711 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-utilities\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:09 crc kubenswrapper[4870]: I0130 08:56:09.932908 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn7sw\" (UniqueName: 
\"kubernetes.io/projected/52b5a042-8bb0-474f-bc28-7d116341bf06-kube-api-access-wn7sw\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.039973 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn7sw\" (UniqueName: \"kubernetes.io/projected/52b5a042-8bb0-474f-bc28-7d116341bf06-kube-api-access-wn7sw\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.040115 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-catalog-content\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.040238 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-utilities\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.040894 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-utilities\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.041510 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-catalog-content\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.061584 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn7sw\" (UniqueName: \"kubernetes.io/projected/52b5a042-8bb0-474f-bc28-7d116341bf06-kube-api-access-wn7sw\") pod \"community-operators-6jj6s\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.110141 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8" path="/var/lib/kubelet/pods/b88d5fc4-6cdd-4231-97b2-6e9dda7a33b8/volumes" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.220552 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:10 crc kubenswrapper[4870]: I0130 08:56:10.792290 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6jj6s"] Jan 30 08:56:11 crc kubenswrapper[4870]: I0130 08:56:11.121502 4870 generic.go:334] "Generic (PLEG): container finished" podID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerID="9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a" exitCode=0 Jan 30 08:56:11 crc kubenswrapper[4870]: I0130 08:56:11.121551 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jj6s" event={"ID":"52b5a042-8bb0-474f-bc28-7d116341bf06","Type":"ContainerDied","Data":"9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a"} Jan 30 08:56:11 crc kubenswrapper[4870]: I0130 08:56:11.121581 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jj6s" 
event={"ID":"52b5a042-8bb0-474f-bc28-7d116341bf06","Type":"ContainerStarted","Data":"1397bf99db3c0cbef19f957fbbdb2abd4d8aa60d8ee4d5dd3023a74c9e29c5cd"} Jan 30 08:56:12 crc kubenswrapper[4870]: I0130 08:56:12.131176 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jj6s" event={"ID":"52b5a042-8bb0-474f-bc28-7d116341bf06","Type":"ContainerStarted","Data":"f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014"} Jan 30 08:56:13 crc kubenswrapper[4870]: E0130 08:56:13.900930 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b5a042_8bb0_474f_bc28_7d116341bf06.slice/crio-f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b5a042_8bb0_474f_bc28_7d116341bf06.slice/crio-conmon-f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014.scope\": RecentStats: unable to find data in memory cache]" Jan 30 08:56:14 crc kubenswrapper[4870]: I0130 08:56:14.148807 4870 generic.go:334] "Generic (PLEG): container finished" podID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerID="f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014" exitCode=0 Jan 30 08:56:14 crc kubenswrapper[4870]: I0130 08:56:14.148903 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jj6s" event={"ID":"52b5a042-8bb0-474f-bc28-7d116341bf06","Type":"ContainerDied","Data":"f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014"} Jan 30 08:56:15 crc kubenswrapper[4870]: I0130 08:56:15.160279 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jj6s" 
event={"ID":"52b5a042-8bb0-474f-bc28-7d116341bf06","Type":"ContainerStarted","Data":"3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b"} Jan 30 08:56:15 crc kubenswrapper[4870]: I0130 08:56:15.185408 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6jj6s" podStartSLOduration=2.746153812 podStartE2EDuration="6.185385988s" podCreationTimestamp="2026-01-30 08:56:09 +0000 UTC" firstStartedPulling="2026-01-30 08:56:11.123665338 +0000 UTC m=+2809.819212447" lastFinishedPulling="2026-01-30 08:56:14.562897514 +0000 UTC m=+2813.258444623" observedRunningTime="2026-01-30 08:56:15.177554284 +0000 UTC m=+2813.873101393" watchObservedRunningTime="2026-01-30 08:56:15.185385988 +0000 UTC m=+2813.880933117" Jan 30 08:56:20 crc kubenswrapper[4870]: I0130 08:56:20.220682 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:20 crc kubenswrapper[4870]: I0130 08:56:20.221173 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:20 crc kubenswrapper[4870]: I0130 08:56:20.280574 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:21 crc kubenswrapper[4870]: I0130 08:56:21.291196 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:21 crc kubenswrapper[4870]: I0130 08:56:21.354836 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6jj6s"] Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.245075 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6jj6s" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="registry-server" 
containerID="cri-o://3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b" gracePeriod=2 Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.742892 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.845862 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn7sw\" (UniqueName: \"kubernetes.io/projected/52b5a042-8bb0-474f-bc28-7d116341bf06-kube-api-access-wn7sw\") pod \"52b5a042-8bb0-474f-bc28-7d116341bf06\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.846052 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-catalog-content\") pod \"52b5a042-8bb0-474f-bc28-7d116341bf06\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.846318 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-utilities\") pod \"52b5a042-8bb0-474f-bc28-7d116341bf06\" (UID: \"52b5a042-8bb0-474f-bc28-7d116341bf06\") " Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.847306 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-utilities" (OuterVolumeSpecName: "utilities") pod "52b5a042-8bb0-474f-bc28-7d116341bf06" (UID: "52b5a042-8bb0-474f-bc28-7d116341bf06"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.865230 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b5a042-8bb0-474f-bc28-7d116341bf06-kube-api-access-wn7sw" (OuterVolumeSpecName: "kube-api-access-wn7sw") pod "52b5a042-8bb0-474f-bc28-7d116341bf06" (UID: "52b5a042-8bb0-474f-bc28-7d116341bf06"). InnerVolumeSpecName "kube-api-access-wn7sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.901405 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52b5a042-8bb0-474f-bc28-7d116341bf06" (UID: "52b5a042-8bb0-474f-bc28-7d116341bf06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.948606 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.948642 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b5a042-8bb0-474f-bc28-7d116341bf06-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 08:56:23 crc kubenswrapper[4870]: I0130 08:56:23.948656 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn7sw\" (UniqueName: \"kubernetes.io/projected/52b5a042-8bb0-474f-bc28-7d116341bf06-kube-api-access-wn7sw\") on node \"crc\" DevicePath \"\"" Jan 30 08:56:24 crc kubenswrapper[4870]: E0130 08:56:24.174989 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b5a042_8bb0_474f_bc28_7d116341bf06.slice\": RecentStats: unable to find data in memory cache]" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.257522 4870 generic.go:334] "Generic (PLEG): container finished" podID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerID="3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b" exitCode=0 Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.257618 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jj6s" event={"ID":"52b5a042-8bb0-474f-bc28-7d116341bf06","Type":"ContainerDied","Data":"3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b"} Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.257821 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jj6s" event={"ID":"52b5a042-8bb0-474f-bc28-7d116341bf06","Type":"ContainerDied","Data":"1397bf99db3c0cbef19f957fbbdb2abd4d8aa60d8ee4d5dd3023a74c9e29c5cd"} Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.257849 4870 scope.go:117] "RemoveContainer" containerID="3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.257669 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6jj6s" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.280628 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6jj6s"] Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.289247 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6jj6s"] Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.291573 4870 scope.go:117] "RemoveContainer" containerID="f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.323600 4870 scope.go:117] "RemoveContainer" containerID="9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.365169 4870 scope.go:117] "RemoveContainer" containerID="3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b" Jan 30 08:56:24 crc kubenswrapper[4870]: E0130 08:56:24.365987 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b\": container with ID starting with 3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b not found: ID does not exist" containerID="3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.366037 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b"} err="failed to get container status \"3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b\": rpc error: code = NotFound desc = could not find container \"3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b\": container with ID starting with 3235f9fe80a91166263d30df27913fe34690c14c54edeacf1eb8d9e013e8060b not 
found: ID does not exist" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.366078 4870 scope.go:117] "RemoveContainer" containerID="f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014" Jan 30 08:56:24 crc kubenswrapper[4870]: E0130 08:56:24.366816 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014\": container with ID starting with f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014 not found: ID does not exist" containerID="f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.366860 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014"} err="failed to get container status \"f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014\": rpc error: code = NotFound desc = could not find container \"f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014\": container with ID starting with f880e1cfa9a280bc4a25e948a919389702e514b95714f1dee0c5caa6cc6ce014 not found: ID does not exist" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.366912 4870 scope.go:117] "RemoveContainer" containerID="9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a" Jan 30 08:56:24 crc kubenswrapper[4870]: E0130 08:56:24.367361 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a\": container with ID starting with 9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a not found: ID does not exist" containerID="9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a" Jan 30 08:56:24 crc kubenswrapper[4870]: I0130 08:56:24.367533 4870 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a"} err="failed to get container status \"9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a\": rpc error: code = NotFound desc = could not find container \"9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a\": container with ID starting with 9534298d81270f4262590015fd5426c5e04ac73af7f2212e9942a942047e072a not found: ID does not exist" Jan 30 08:56:26 crc kubenswrapper[4870]: I0130 08:56:26.093646 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" path="/var/lib/kubelet/pods/52b5a042-8bb0-474f-bc28-7d116341bf06/volumes" Jan 30 08:56:26 crc kubenswrapper[4870]: I0130 08:56:26.839681 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:56:26 crc kubenswrapper[4870]: I0130 08:56:26.840238 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="prometheus" containerID="cri-o://ea102a1406731d57700d5196e250072c2053fa2345212a3d6975e629610cb94c" gracePeriod=600 Jan 30 08:56:26 crc kubenswrapper[4870]: I0130 08:56:26.840318 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="thanos-sidecar" containerID="cri-o://d46fd5e887baec843bdd4f9f0254772bf4dc50323e052cd052dd2ea4657b7397" gracePeriod=600 Jan 30 08:56:26 crc kubenswrapper[4870]: I0130 08:56:26.840379 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="config-reloader" 
containerID="cri-o://a9347a512a592b79cb85be5a5a664bfadec21fed65bd7eacf1a97eb008166eb1" gracePeriod=600 Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.292034 4870 generic.go:334] "Generic (PLEG): container finished" podID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerID="d46fd5e887baec843bdd4f9f0254772bf4dc50323e052cd052dd2ea4657b7397" exitCode=0 Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.292073 4870 generic.go:334] "Generic (PLEG): container finished" podID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerID="a9347a512a592b79cb85be5a5a664bfadec21fed65bd7eacf1a97eb008166eb1" exitCode=0 Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.292083 4870 generic.go:334] "Generic (PLEG): container finished" podID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerID="ea102a1406731d57700d5196e250072c2053fa2345212a3d6975e629610cb94c" exitCode=0 Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.292111 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerDied","Data":"d46fd5e887baec843bdd4f9f0254772bf4dc50323e052cd052dd2ea4657b7397"} Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.292142 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerDied","Data":"a9347a512a592b79cb85be5a5a664bfadec21fed65bd7eacf1a97eb008166eb1"} Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.292155 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerDied","Data":"ea102a1406731d57700d5196e250072c2053fa2345212a3d6975e629610cb94c"} Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.435426 4870 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" 
podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.135:9090/-/ready\": dial tcp 10.217.0.135:9090: connect: connection refused" Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.860932 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.930082 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-secret-combined-ca-bundle\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.930356 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-0\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.930575 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config-out\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.930663 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-2\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") " Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.930803 4870 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-1\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.931680 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.931789 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-tls-assets\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.931956 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.932044 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.932138 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.932211 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5j9q\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-kube-api-access-f5j9q\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.932304 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.932382 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-thanos-prometheus-http-client-file\") pod \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\" (UID: \"8c8b2056-4db2-489e-b1d1-b201e38e84c8\") "
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.932507 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.933019 4870 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.933463 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.936784 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.941950 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.942691 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config" (OuterVolumeSpecName: "config") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.942869 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-kube-api-access-f5j9q" (OuterVolumeSpecName: "kube-api-access-f5j9q") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "kube-api-access-f5j9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.946767 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.948214 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.951730 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.961170 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config-out" (OuterVolumeSpecName: "config-out") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:56:27 crc kubenswrapper[4870]: I0130 08:56:27.963478 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.000493 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.035989 4870 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036054 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") on node \"crc\" "
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036071 4870 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-tls-assets\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036085 4870 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036102 4870 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036113 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5j9q\" (UniqueName: \"kubernetes.io/projected/8c8b2056-4db2-489e-b1d1-b201e38e84c8-kube-api-access-f5j9q\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036125 4870 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036137 4870 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036149 4870 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036160 4870 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8c8b2056-4db2-489e-b1d1-b201e38e84c8-config-out\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.036170 4870 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/8c8b2056-4db2-489e-b1d1-b201e38e84c8-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.065098 4870 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.065261 4870 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1") on node "crc"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.100896 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config" (OuterVolumeSpecName: "web-config") pod "8c8b2056-4db2-489e-b1d1-b201e38e84c8" (UID: "8c8b2056-4db2-489e-b1d1-b201e38e84c8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.139995 4870 reconciler_common.go:293] "Volume detached for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.140030 4870 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8c8b2056-4db2-489e-b1d1-b201e38e84c8-web-config\") on node \"crc\" DevicePath \"\""
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.304920 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8c8b2056-4db2-489e-b1d1-b201e38e84c8","Type":"ContainerDied","Data":"290c2383e9eae83628bb57bb648be794756981366804a5738c5a44985dd7ad40"}
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.304977 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.304985 4870 scope.go:117] "RemoveContainer" containerID="d46fd5e887baec843bdd4f9f0254772bf4dc50323e052cd052dd2ea4657b7397"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.334076 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.335084 4870 scope.go:117] "RemoveContainer" containerID="a9347a512a592b79cb85be5a5a664bfadec21fed65bd7eacf1a97eb008166eb1"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.347532 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.359669 4870 scope.go:117] "RemoveContainer" containerID="ea102a1406731d57700d5196e250072c2053fa2345212a3d6975e629610cb94c"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.374863 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 30 08:56:28 crc kubenswrapper[4870]: E0130 08:56:28.375297 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="config-reloader"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375314 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="config-reloader"
Jan 30 08:56:28 crc kubenswrapper[4870]: E0130 08:56:28.375326 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="thanos-sidecar"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375331 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="thanos-sidecar"
Jan 30 08:56:28 crc kubenswrapper[4870]: E0130 08:56:28.375341 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="prometheus"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375347 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="prometheus"
Jan 30 08:56:28 crc kubenswrapper[4870]: E0130 08:56:28.375372 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="extract-content"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375377 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="extract-content"
Jan 30 08:56:28 crc kubenswrapper[4870]: E0130 08:56:28.375392 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="registry-server"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375398 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="registry-server"
Jan 30 08:56:28 crc kubenswrapper[4870]: E0130 08:56:28.375408 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="extract-utilities"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375414 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="extract-utilities"
Jan 30 08:56:28 crc kubenswrapper[4870]: E0130 08:56:28.375427 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="init-config-reloader"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375433 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="init-config-reloader"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375796 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="prometheus"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375822 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="thanos-sidecar"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375838 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b5a042-8bb0-474f-bc28-7d116341bf06" containerName="registry-server"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.375855 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" containerName="config-reloader"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.383788 4870 scope.go:117] "RemoveContainer" containerID="93e4e1345741b60dca904480e4327da8f596dec2a8d2178c87fe6d5632a2daeb"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.384382 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.388555 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.390258 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.390556 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.391016 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.391087 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-88lql"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.391220 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.391235 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.402606 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.414592 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445182 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445236 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445279 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445319 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445353 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d5397-32f0-4cc0-919b-cf4ed004b797-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445401 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445521 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrwrq\" (UniqueName: \"kubernetes.io/projected/1a4d5397-32f0-4cc0-919b-cf4ed004b797-kube-api-access-hrwrq\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445597 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445684 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445729 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445869 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445924 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.445953 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d5397-32f0-4cc0-919b-cf4ed004b797-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.547618 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548273 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548326 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548370 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548400 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d5397-32f0-4cc0-919b-cf4ed004b797-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548457 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548563 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrwrq\" (UniqueName: \"kubernetes.io/projected/1a4d5397-32f0-4cc0-919b-cf4ed004b797-kube-api-access-hrwrq\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548603 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548672 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548717 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548755 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548778 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.548801 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d5397-32f0-4cc0-919b-cf4ed004b797-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.549593 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.549593 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.549662 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1a4d5397-32f0-4cc0-919b-cf4ed004b797-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.553450 4870 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.553496 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b608408b27cf3925c08af2a9b3a133a2b5eb87db3a290a5641371b0533b7f7d2/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.554216 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.556140 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d5397-32f0-4cc0-919b-cf4ed004b797-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.558972 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.558997 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.559258 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d5397-32f0-4cc0-919b-cf4ed004b797-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.559479 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.562681 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.564024 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/1a4d5397-32f0-4cc0-919b-cf4ed004b797-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.574250 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-hrwrq\" (UniqueName: \"kubernetes.io/projected/1a4d5397-32f0-4cc0-919b-cf4ed004b797-kube-api-access-hrwrq\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.620641 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-551e78ca-f27b-4988-a731-fdbb92ea32f1\") pod \"prometheus-metric-storage-0\" (UID: \"1a4d5397-32f0-4cc0-919b-cf4ed004b797\") " pod="openstack/prometheus-metric-storage-0" Jan 30 08:56:28 crc kubenswrapper[4870]: I0130 08:56:28.762437 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 08:56:29 crc kubenswrapper[4870]: I0130 08:56:29.309658 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 08:56:29 crc kubenswrapper[4870]: I0130 08:56:29.346586 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a4d5397-32f0-4cc0-919b-cf4ed004b797","Type":"ContainerStarted","Data":"16cd24868de028fb744a58f52f28012a8226e6660a3e945e583d11350e5d9fa9"} Jan 30 08:56:30 crc kubenswrapper[4870]: I0130 08:56:30.087166 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8b2056-4db2-489e-b1d1-b201e38e84c8" path="/var/lib/kubelet/pods/8c8b2056-4db2-489e-b1d1-b201e38e84c8/volumes" Jan 30 08:56:33 crc kubenswrapper[4870]: I0130 08:56:33.385048 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a4d5397-32f0-4cc0-919b-cf4ed004b797","Type":"ContainerStarted","Data":"9a9c2d1a2710347a06c38820d11f1b5b73a9bf551910bc03bc75d44b0d9cc52f"} Jan 30 08:56:41 crc kubenswrapper[4870]: I0130 08:56:41.461444 4870 generic.go:334] "Generic 
(PLEG): container finished" podID="1a4d5397-32f0-4cc0-919b-cf4ed004b797" containerID="9a9c2d1a2710347a06c38820d11f1b5b73a9bf551910bc03bc75d44b0d9cc52f" exitCode=0 Jan 30 08:56:41 crc kubenswrapper[4870]: I0130 08:56:41.461530 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a4d5397-32f0-4cc0-919b-cf4ed004b797","Type":"ContainerDied","Data":"9a9c2d1a2710347a06c38820d11f1b5b73a9bf551910bc03bc75d44b0d9cc52f"} Jan 30 08:56:42 crc kubenswrapper[4870]: I0130 08:56:42.474977 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a4d5397-32f0-4cc0-919b-cf4ed004b797","Type":"ContainerStarted","Data":"5e54e3c60810f07593c4f31dd1619612db1222c28b053ffcf6d3f53579eab58f"} Jan 30 08:56:45 crc kubenswrapper[4870]: I0130 08:56:45.513456 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a4d5397-32f0-4cc0-919b-cf4ed004b797","Type":"ContainerStarted","Data":"260cd41ef24f5ba7b55ab1319b66e0603fed3f8b96c623eb70572465578ecd8f"} Jan 30 08:56:46 crc kubenswrapper[4870]: I0130 08:56:46.525368 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a4d5397-32f0-4cc0-919b-cf4ed004b797","Type":"ContainerStarted","Data":"82a2bd08b9c97b2c84ab230c42a757060be2382a533996f15685f7dd8eda511d"} Jan 30 08:56:46 crc kubenswrapper[4870]: I0130 08:56:46.570171 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.570147523 podStartE2EDuration="18.570147523s" podCreationTimestamp="2026-01-30 08:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 08:56:46.550838119 +0000 UTC m=+2845.246385238" watchObservedRunningTime="2026-01-30 08:56:46.570147523 +0000 UTC m=+2845.265694642" Jan 30 
08:56:48 crc kubenswrapper[4870]: I0130 08:56:48.762849 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 08:56:58 crc kubenswrapper[4870]: I0130 08:56:58.763406 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 08:56:58 crc kubenswrapper[4870]: I0130 08:56:58.772502 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 08:56:59 crc kubenswrapper[4870]: I0130 08:56:59.658018 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.562659 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.565185 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.567687 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.567904 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w7v26" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.568200 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.569112 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.587278 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.676622 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-config-data\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.676665 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.676734 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bggh\" (UniqueName: \"kubernetes.io/projected/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-kube-api-access-8bggh\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.676892 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.676986 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.677085 4870 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.677204 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.677407 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.677458 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.779527 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 
08:57:14.779604 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.779671 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.779755 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.780166 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.780382 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.780627 4870 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.780707 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-config-data\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.780736 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.780823 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bggh\" (UniqueName: \"kubernetes.io/projected/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-kube-api-access-8bggh\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.780875 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.781149 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.782043 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-config-data\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.782057 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.792622 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.793047 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.796739 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc 
kubenswrapper[4870]: I0130 08:57:14.801328 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bggh\" (UniqueName: \"kubernetes.io/projected/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-kube-api-access-8bggh\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.823749 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " pod="openstack/tempest-tests-tempest" Jan 30 08:57:14 crc kubenswrapper[4870]: I0130 08:57:14.884851 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 08:57:15 crc kubenswrapper[4870]: I0130 08:57:15.395319 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 08:57:15 crc kubenswrapper[4870]: I0130 08:57:15.799363 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a","Type":"ContainerStarted","Data":"1c881927627a156ba1416d85da9f209f5ec355b05e5dce2ac4e41aa800f2573b"} Jan 30 08:57:25 crc kubenswrapper[4870]: I0130 08:57:25.249732 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 08:57:25 crc kubenswrapper[4870]: I0130 08:57:25.251036 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 08:57:26 crc kubenswrapper[4870]: I0130 08:57:26.297923 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 08:57:27 crc kubenswrapper[4870]: I0130 08:57:27.933210 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a","Type":"ContainerStarted","Data":"e510327daa135710d56632aefcbd974a031585074a72c0b411cbaf1ee33eb7a9"} Jan 30 08:57:27 crc kubenswrapper[4870]: I0130 08:57:27.957480 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.061621058 podStartE2EDuration="14.957463224s" podCreationTimestamp="2026-01-30 08:57:13 +0000 UTC" firstStartedPulling="2026-01-30 08:57:15.399335637 +0000 UTC m=+2874.094882756" lastFinishedPulling="2026-01-30 08:57:26.295177813 +0000 UTC m=+2884.990724922" observedRunningTime="2026-01-30 08:57:27.946890494 +0000 UTC m=+2886.642437603" watchObservedRunningTime="2026-01-30 08:57:27.957463224 +0000 UTC m=+2886.653010333" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.808658 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dk8db"] Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.814484 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.834254 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dk8db"] Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.896428 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-catalog-content\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.896507 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-utilities\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.896585 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmq5v\" (UniqueName: \"kubernetes.io/projected/5a375fc2-49c4-42c7-a029-34fde5c159cf-kube-api-access-gmq5v\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.999010 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-utilities\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.999431 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gmq5v\" (UniqueName: \"kubernetes.io/projected/5a375fc2-49c4-42c7-a029-34fde5c159cf-kube-api-access-gmq5v\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.999544 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-utilities\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:43 crc kubenswrapper[4870]: I0130 08:57:43.999574 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-catalog-content\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:44 crc kubenswrapper[4870]: I0130 08:57:43.999805 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-catalog-content\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:44 crc kubenswrapper[4870]: I0130 08:57:44.019235 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmq5v\" (UniqueName: \"kubernetes.io/projected/5a375fc2-49c4-42c7-a029-34fde5c159cf-kube-api-access-gmq5v\") pod \"certified-operators-dk8db\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") " pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:44 crc kubenswrapper[4870]: I0130 08:57:44.142457 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dk8db" Jan 30 08:57:44 crc kubenswrapper[4870]: I0130 08:57:44.658011 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dk8db"] Jan 30 08:57:44 crc kubenswrapper[4870]: W0130 08:57:44.659194 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a375fc2_49c4_42c7_a029_34fde5c159cf.slice/crio-8bc1dd9b3b99058b263a44cf32457ac4dd41def79ff1d253204239fb663b9df3 WatchSource:0}: Error finding container 8bc1dd9b3b99058b263a44cf32457ac4dd41def79ff1d253204239fb663b9df3: Status 404 returned error can't find the container with id 8bc1dd9b3b99058b263a44cf32457ac4dd41def79ff1d253204239fb663b9df3 Jan 30 08:57:45 crc kubenswrapper[4870]: I0130 08:57:45.591283 4870 generic.go:334] "Generic (PLEG): container finished" podID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerID="fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231" exitCode=0 Jan 30 08:57:45 crc kubenswrapper[4870]: I0130 08:57:45.591355 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk8db" event={"ID":"5a375fc2-49c4-42c7-a029-34fde5c159cf","Type":"ContainerDied","Data":"fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231"} Jan 30 08:57:45 crc kubenswrapper[4870]: I0130 08:57:45.591564 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk8db" event={"ID":"5a375fc2-49c4-42c7-a029-34fde5c159cf","Type":"ContainerStarted","Data":"8bc1dd9b3b99058b263a44cf32457ac4dd41def79ff1d253204239fb663b9df3"} Jan 30 08:57:46 crc kubenswrapper[4870]: I0130 08:57:46.602652 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk8db" 
event={"ID":"5a375fc2-49c4-42c7-a029-34fde5c159cf","Type":"ContainerStarted","Data":"0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f"} Jan 30 08:57:48 crc kubenswrapper[4870]: I0130 08:57:48.626708 4870 generic.go:334] "Generic (PLEG): container finished" podID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerID="0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f" exitCode=0 Jan 30 08:57:48 crc kubenswrapper[4870]: I0130 08:57:48.626823 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk8db" event={"ID":"5a375fc2-49c4-42c7-a029-34fde5c159cf","Type":"ContainerDied","Data":"0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f"} Jan 30 08:57:48 crc kubenswrapper[4870]: I0130 08:57:48.629853 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 08:57:49 crc kubenswrapper[4870]: I0130 08:57:49.638445 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk8db" event={"ID":"5a375fc2-49c4-42c7-a029-34fde5c159cf","Type":"ContainerStarted","Data":"4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61"} Jan 30 08:57:49 crc kubenswrapper[4870]: I0130 08:57:49.665679 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dk8db" podStartSLOduration=3.266985549 podStartE2EDuration="6.665653427s" podCreationTimestamp="2026-01-30 08:57:43 +0000 UTC" firstStartedPulling="2026-01-30 08:57:45.593283024 +0000 UTC m=+2904.288830173" lastFinishedPulling="2026-01-30 08:57:48.991950942 +0000 UTC m=+2907.687498051" observedRunningTime="2026-01-30 08:57:49.657733329 +0000 UTC m=+2908.353280438" watchObservedRunningTime="2026-01-30 08:57:49.665653427 +0000 UTC m=+2908.361200536" Jan 30 08:57:54 crc kubenswrapper[4870]: I0130 08:57:54.142714 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-dk8db"
Jan 30 08:57:54 crc kubenswrapper[4870]: I0130 08:57:54.143331 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dk8db"
Jan 30 08:57:54 crc kubenswrapper[4870]: I0130 08:57:54.202370 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dk8db"
Jan 30 08:57:54 crc kubenswrapper[4870]: I0130 08:57:54.742041 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dk8db"
Jan 30 08:57:54 crc kubenswrapper[4870]: I0130 08:57:54.819809 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dk8db"]
Jan 30 08:57:55 crc kubenswrapper[4870]: I0130 08:57:55.249402 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:57:55 crc kubenswrapper[4870]: I0130 08:57:55.249732 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 08:57:56 crc kubenswrapper[4870]: I0130 08:57:56.700453 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dk8db" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="registry-server" containerID="cri-o://4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61" gracePeriod=2
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.219771 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dk8db"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.413616 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-catalog-content\") pod \"5a375fc2-49c4-42c7-a029-34fde5c159cf\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") "
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.413701 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmq5v\" (UniqueName: \"kubernetes.io/projected/5a375fc2-49c4-42c7-a029-34fde5c159cf-kube-api-access-gmq5v\") pod \"5a375fc2-49c4-42c7-a029-34fde5c159cf\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") "
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.413775 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-utilities\") pod \"5a375fc2-49c4-42c7-a029-34fde5c159cf\" (UID: \"5a375fc2-49c4-42c7-a029-34fde5c159cf\") "
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.415011 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-utilities" (OuterVolumeSpecName: "utilities") pod "5a375fc2-49c4-42c7-a029-34fde5c159cf" (UID: "5a375fc2-49c4-42c7-a029-34fde5c159cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.425104 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a375fc2-49c4-42c7-a029-34fde5c159cf-kube-api-access-gmq5v" (OuterVolumeSpecName: "kube-api-access-gmq5v") pod "5a375fc2-49c4-42c7-a029-34fde5c159cf" (UID: "5a375fc2-49c4-42c7-a029-34fde5c159cf"). InnerVolumeSpecName "kube-api-access-gmq5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.497067 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a375fc2-49c4-42c7-a029-34fde5c159cf" (UID: "5a375fc2-49c4-42c7-a029-34fde5c159cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.517025 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.517072 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a375fc2-49c4-42c7-a029-34fde5c159cf-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.517086 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmq5v\" (UniqueName: \"kubernetes.io/projected/5a375fc2-49c4-42c7-a029-34fde5c159cf-kube-api-access-gmq5v\") on node \"crc\" DevicePath \"\""
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.715629 4870 generic.go:334] "Generic (PLEG): container finished" podID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerID="4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61" exitCode=0
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.715689 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dk8db"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.715714 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk8db" event={"ID":"5a375fc2-49c4-42c7-a029-34fde5c159cf","Type":"ContainerDied","Data":"4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61"}
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.716022 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk8db" event={"ID":"5a375fc2-49c4-42c7-a029-34fde5c159cf","Type":"ContainerDied","Data":"8bc1dd9b3b99058b263a44cf32457ac4dd41def79ff1d253204239fb663b9df3"}
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.716043 4870 scope.go:117] "RemoveContainer" containerID="4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.751261 4870 scope.go:117] "RemoveContainer" containerID="0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.753558 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dk8db"]
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.771739 4870 scope.go:117] "RemoveContainer" containerID="fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.772128 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dk8db"]
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.834429 4870 scope.go:117] "RemoveContainer" containerID="4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61"
Jan 30 08:57:57 crc kubenswrapper[4870]: E0130 08:57:57.835083 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61\": container with ID starting with 4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61 not found: ID does not exist" containerID="4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.835139 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61"} err="failed to get container status \"4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61\": rpc error: code = NotFound desc = could not find container \"4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61\": container with ID starting with 4ddb7739156f016ea120f6a0b74e693c401cad10f668a738e8930df3886c0c61 not found: ID does not exist"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.835170 4870 scope.go:117] "RemoveContainer" containerID="0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f"
Jan 30 08:57:57 crc kubenswrapper[4870]: E0130 08:57:57.835679 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f\": container with ID starting with 0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f not found: ID does not exist" containerID="0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.835729 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f"} err="failed to get container status \"0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f\": rpc error: code = NotFound desc = could not find container \"0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f\": container with ID starting with 0dee4777b76a82b1fdc63504969da64f34a5856c1db03793c852224b9b42f73f not found: ID does not exist"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.835764 4870 scope.go:117] "RemoveContainer" containerID="fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231"
Jan 30 08:57:57 crc kubenswrapper[4870]: E0130 08:57:57.836192 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231\": container with ID starting with fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231 not found: ID does not exist" containerID="fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231"
Jan 30 08:57:57 crc kubenswrapper[4870]: I0130 08:57:57.836215 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231"} err="failed to get container status \"fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231\": rpc error: code = NotFound desc = could not find container \"fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231\": container with ID starting with fa7c57f7496863d99ccce13e5e7a680e7f7c52a9ee29c22ae217f26d0ec42231 not found: ID does not exist"
Jan 30 08:57:58 crc kubenswrapper[4870]: I0130 08:57:58.087372 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" path="/var/lib/kubelet/pods/5a375fc2-49c4-42c7-a029-34fde5c159cf/volumes"
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.249291 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.249786 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.249841 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8"
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.250721 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.250790 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" gracePeriod=600
Jan 30 08:58:25 crc kubenswrapper[4870]: E0130 08:58:25.383094 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.996726 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" exitCode=0
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.996780 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"}
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.996841 4870 scope.go:117] "RemoveContainer" containerID="9db7902fb455898f2a67824f5d2bac1880accc6a6ce6fbe42d5af520837903b4"
Jan 30 08:58:25 crc kubenswrapper[4870]: I0130 08:58:25.997823 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:58:25 crc kubenswrapper[4870]: E0130 08:58:25.998519 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:58:39 crc kubenswrapper[4870]: I0130 08:58:39.075208 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:58:39 crc kubenswrapper[4870]: E0130 08:58:39.076327 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:58:50 crc kubenswrapper[4870]: I0130 08:58:50.074903 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:58:50 crc kubenswrapper[4870]: E0130 08:58:50.075965 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:59:03 crc kubenswrapper[4870]: I0130 08:59:03.075381 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:59:03 crc kubenswrapper[4870]: E0130 08:59:03.076272 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:59:17 crc kubenswrapper[4870]: I0130 08:59:17.075621 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:59:17 crc kubenswrapper[4870]: E0130 08:59:17.076807 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:59:30 crc kubenswrapper[4870]: I0130 08:59:30.074898 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:59:30 crc kubenswrapper[4870]: E0130 08:59:30.075676 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:59:41 crc kubenswrapper[4870]: I0130 08:59:41.075469 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:59:41 crc kubenswrapper[4870]: E0130 08:59:41.076768 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 08:59:53 crc kubenswrapper[4870]: I0130 08:59:53.075146 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 08:59:53 crc kubenswrapper[4870]: E0130 08:59:53.076200 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.151056 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"]
Jan 30 09:00:00 crc kubenswrapper[4870]: E0130 09:00:00.152166 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="extract-utilities"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.152186 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="extract-utilities"
Jan 30 09:00:00 crc kubenswrapper[4870]: E0130 09:00:00.152214 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="registry-server"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.152222 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="registry-server"
Jan 30 09:00:00 crc kubenswrapper[4870]: E0130 09:00:00.152248 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="extract-content"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.152257 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="extract-content"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.152505 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a375fc2-49c4-42c7-a029-34fde5c159cf" containerName="registry-server"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.155010 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.161289 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.161289 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.182633 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"]
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.218216 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f537a705-b98d-4cc1-8fba-f9fb4145fc33-config-volume\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.218348 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp7jx\" (UniqueName: \"kubernetes.io/projected/f537a705-b98d-4cc1-8fba-f9fb4145fc33-kube-api-access-xp7jx\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.218499 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f537a705-b98d-4cc1-8fba-f9fb4145fc33-secret-volume\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.320593 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f537a705-b98d-4cc1-8fba-f9fb4145fc33-config-volume\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.320657 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp7jx\" (UniqueName: \"kubernetes.io/projected/f537a705-b98d-4cc1-8fba-f9fb4145fc33-kube-api-access-xp7jx\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.320779 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f537a705-b98d-4cc1-8fba-f9fb4145fc33-secret-volume\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.321560 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f537a705-b98d-4cc1-8fba-f9fb4145fc33-config-volume\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.328057 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f537a705-b98d-4cc1-8fba-f9fb4145fc33-secret-volume\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.342576 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp7jx\" (UniqueName: \"kubernetes.io/projected/f537a705-b98d-4cc1-8fba-f9fb4145fc33-kube-api-access-xp7jx\") pod \"collect-profiles-29496060-gq9mb\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.482226 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.924424 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"]
Jan 30 09:00:00 crc kubenswrapper[4870]: I0130 09:00:00.954553 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb" event={"ID":"f537a705-b98d-4cc1-8fba-f9fb4145fc33","Type":"ContainerStarted","Data":"8d7b9cb36aff363a87a4e3e13e6a4a3eb2da89546a6ff8ab278af0d598dd103b"}
Jan 30 09:00:01 crc kubenswrapper[4870]: I0130 09:00:01.965579 4870 generic.go:334] "Generic (PLEG): container finished" podID="f537a705-b98d-4cc1-8fba-f9fb4145fc33" containerID="21570fbd391aa6805bfee83f36df9ca917daf03782908d47cd7dd4eedf90e176" exitCode=0
Jan 30 09:00:01 crc kubenswrapper[4870]: I0130 09:00:01.965744 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb" event={"ID":"f537a705-b98d-4cc1-8fba-f9fb4145fc33","Type":"ContainerDied","Data":"21570fbd391aa6805bfee83f36df9ca917daf03782908d47cd7dd4eedf90e176"}
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.367269 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.390545 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f537a705-b98d-4cc1-8fba-f9fb4145fc33-secret-volume\") pod \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") "
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.390584 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f537a705-b98d-4cc1-8fba-f9fb4145fc33-config-volume\") pod \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") "
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.390698 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp7jx\" (UniqueName: \"kubernetes.io/projected/f537a705-b98d-4cc1-8fba-f9fb4145fc33-kube-api-access-xp7jx\") pod \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\" (UID: \"f537a705-b98d-4cc1-8fba-f9fb4145fc33\") "
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.391456 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f537a705-b98d-4cc1-8fba-f9fb4145fc33-config-volume" (OuterVolumeSpecName: "config-volume") pod "f537a705-b98d-4cc1-8fba-f9fb4145fc33" (UID: "f537a705-b98d-4cc1-8fba-f9fb4145fc33"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.395996 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f537a705-b98d-4cc1-8fba-f9fb4145fc33-kube-api-access-xp7jx" (OuterVolumeSpecName: "kube-api-access-xp7jx") pod "f537a705-b98d-4cc1-8fba-f9fb4145fc33" (UID: "f537a705-b98d-4cc1-8fba-f9fb4145fc33"). InnerVolumeSpecName "kube-api-access-xp7jx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.404020 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f537a705-b98d-4cc1-8fba-f9fb4145fc33-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f537a705-b98d-4cc1-8fba-f9fb4145fc33" (UID: "f537a705-b98d-4cc1-8fba-f9fb4145fc33"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.492995 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f537a705-b98d-4cc1-8fba-f9fb4145fc33-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.493031 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f537a705-b98d-4cc1-8fba-f9fb4145fc33-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.493042 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp7jx\" (UniqueName: \"kubernetes.io/projected/f537a705-b98d-4cc1-8fba-f9fb4145fc33-kube-api-access-xp7jx\") on node \"crc\" DevicePath \"\""
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.989292 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb" event={"ID":"f537a705-b98d-4cc1-8fba-f9fb4145fc33","Type":"ContainerDied","Data":"8d7b9cb36aff363a87a4e3e13e6a4a3eb2da89546a6ff8ab278af0d598dd103b"}
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.989673 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7b9cb36aff363a87a4e3e13e6a4a3eb2da89546a6ff8ab278af0d598dd103b"
Jan 30 09:00:03 crc kubenswrapper[4870]: I0130 09:00:03.989417 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"
Jan 30 09:00:04 crc kubenswrapper[4870]: I0130 09:00:04.457011 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc"]
Jan 30 09:00:04 crc kubenswrapper[4870]: I0130 09:00:04.473073 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496015-h7vrc"]
Jan 30 09:00:06 crc kubenswrapper[4870]: I0130 09:00:06.096313 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ecc6b5-f0f3-40b1-ba86-24eabdbdc409" path="/var/lib/kubelet/pods/84ecc6b5-f0f3-40b1-ba86-24eabdbdc409/volumes"
Jan 30 09:00:08 crc kubenswrapper[4870]: I0130 09:00:08.074742 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 09:00:08 crc kubenswrapper[4870]: E0130 09:00:08.075725 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:00:20 crc kubenswrapper[4870]: I0130 09:00:20.075509 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 09:00:20 crc kubenswrapper[4870]: E0130 09:00:20.077679 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:00:33 crc kubenswrapper[4870]: I0130 09:00:33.074695 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 09:00:33 crc kubenswrapper[4870]: E0130 09:00:33.075520 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:00:38 crc kubenswrapper[4870]: I0130 09:00:38.299242 4870 scope.go:117] "RemoveContainer" containerID="906fa4603bfe71976f941c25c726c6a5f3b1b9c0bede621580c2910f359fd6f2"
Jan 30 09:00:44 crc kubenswrapper[4870]: I0130 09:00:44.075326 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 09:00:44 crc kubenswrapper[4870]: E0130 09:00:44.076277 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:00:58 crc kubenswrapper[4870]: I0130 09:00:58.075159 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283"
Jan 30 09:00:58 crc kubenswrapper[4870]: E0130 09:00:58.076418 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.167872 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29496061-tjh7b"]
Jan 30 09:01:00 crc kubenswrapper[4870]: E0130 09:01:00.168639 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f537a705-b98d-4cc1-8fba-f9fb4145fc33" containerName="collect-profiles"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.168652 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="f537a705-b98d-4cc1-8fba-f9fb4145fc33" containerName="collect-profiles"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.168861 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="f537a705-b98d-4cc1-8fba-f9fb4145fc33" containerName="collect-profiles"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.170037 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496061-tjh7b"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.189058 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496061-tjh7b"]
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.318361 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2p4\" (UniqueName: \"kubernetes.io/projected/43a9af69-f9ef-444e-8505-ccf1eac1a036-kube-api-access-5s2p4\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.318419 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-fernet-keys\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.318488 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-combined-ca-bundle\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.318515 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-config-data\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b"
Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.420978 4870 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"kube-api-access-5s2p4\" (UniqueName: \"kubernetes.io/projected/43a9af69-f9ef-444e-8505-ccf1eac1a036-kube-api-access-5s2p4\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.421046 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-fernet-keys\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.421106 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-combined-ca-bundle\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.421127 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-config-data\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.429657 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-fernet-keys\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.434869 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-combined-ca-bundle\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.436589 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-config-data\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.445000 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2p4\" (UniqueName: \"kubernetes.io/projected/43a9af69-f9ef-444e-8505-ccf1eac1a036-kube-api-access-5s2p4\") pod \"keystone-cron-29496061-tjh7b\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:00 crc kubenswrapper[4870]: I0130 09:01:00.501439 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:01 crc kubenswrapper[4870]: I0130 09:01:01.017250 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496061-tjh7b"] Jan 30 09:01:01 crc kubenswrapper[4870]: I0130 09:01:01.492977 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496061-tjh7b" event={"ID":"43a9af69-f9ef-444e-8505-ccf1eac1a036","Type":"ContainerStarted","Data":"015be41b58e089227cb06e61cbafc7f719a04446c8960392bdc84dfdeaa2514b"} Jan 30 09:01:02 crc kubenswrapper[4870]: I0130 09:01:02.502239 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496061-tjh7b" event={"ID":"43a9af69-f9ef-444e-8505-ccf1eac1a036","Type":"ContainerStarted","Data":"b6529962890f5d75e098916eb17988c45823be9141cadf7c34ba6541efc047f6"} Jan 30 09:01:02 crc kubenswrapper[4870]: I0130 09:01:02.529036 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29496061-tjh7b" podStartSLOduration=2.529010326 podStartE2EDuration="2.529010326s" podCreationTimestamp="2026-01-30 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 09:01:02.51599526 +0000 UTC m=+3101.211542369" watchObservedRunningTime="2026-01-30 09:01:02.529010326 +0000 UTC m=+3101.224557435" Jan 30 09:01:06 crc kubenswrapper[4870]: I0130 09:01:06.539382 4870 generic.go:334] "Generic (PLEG): container finished" podID="43a9af69-f9ef-444e-8505-ccf1eac1a036" containerID="b6529962890f5d75e098916eb17988c45823be9141cadf7c34ba6541efc047f6" exitCode=0 Jan 30 09:01:06 crc kubenswrapper[4870]: I0130 09:01:06.539481 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496061-tjh7b" 
event={"ID":"43a9af69-f9ef-444e-8505-ccf1eac1a036","Type":"ContainerDied","Data":"b6529962890f5d75e098916eb17988c45823be9141cadf7c34ba6541efc047f6"} Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.003835 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.110991 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-combined-ca-bundle\") pod \"43a9af69-f9ef-444e-8505-ccf1eac1a036\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.111191 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-config-data\") pod \"43a9af69-f9ef-444e-8505-ccf1eac1a036\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.111324 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2p4\" (UniqueName: \"kubernetes.io/projected/43a9af69-f9ef-444e-8505-ccf1eac1a036-kube-api-access-5s2p4\") pod \"43a9af69-f9ef-444e-8505-ccf1eac1a036\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.111382 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-fernet-keys\") pod \"43a9af69-f9ef-444e-8505-ccf1eac1a036\" (UID: \"43a9af69-f9ef-444e-8505-ccf1eac1a036\") " Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.118398 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "43a9af69-f9ef-444e-8505-ccf1eac1a036" (UID: "43a9af69-f9ef-444e-8505-ccf1eac1a036"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.119317 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a9af69-f9ef-444e-8505-ccf1eac1a036-kube-api-access-5s2p4" (OuterVolumeSpecName: "kube-api-access-5s2p4") pod "43a9af69-f9ef-444e-8505-ccf1eac1a036" (UID: "43a9af69-f9ef-444e-8505-ccf1eac1a036"). InnerVolumeSpecName "kube-api-access-5s2p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.145795 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43a9af69-f9ef-444e-8505-ccf1eac1a036" (UID: "43a9af69-f9ef-444e-8505-ccf1eac1a036"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.178923 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-config-data" (OuterVolumeSpecName: "config-data") pod "43a9af69-f9ef-444e-8505-ccf1eac1a036" (UID: "43a9af69-f9ef-444e-8505-ccf1eac1a036"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.213685 4870 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.213724 4870 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.213737 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a9af69-f9ef-444e-8505-ccf1eac1a036-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.213748 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2p4\" (UniqueName: \"kubernetes.io/projected/43a9af69-f9ef-444e-8505-ccf1eac1a036-kube-api-access-5s2p4\") on node \"crc\" DevicePath \"\"" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.560535 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496061-tjh7b" event={"ID":"43a9af69-f9ef-444e-8505-ccf1eac1a036","Type":"ContainerDied","Data":"015be41b58e089227cb06e61cbafc7f719a04446c8960392bdc84dfdeaa2514b"} Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.560592 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496061-tjh7b" Jan 30 09:01:08 crc kubenswrapper[4870]: I0130 09:01:08.560600 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="015be41b58e089227cb06e61cbafc7f719a04446c8960392bdc84dfdeaa2514b" Jan 30 09:01:09 crc kubenswrapper[4870]: I0130 09:01:09.076369 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:01:09 crc kubenswrapper[4870]: E0130 09:01:09.076688 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.842929 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9wkl4"] Jan 30 09:01:15 crc kubenswrapper[4870]: E0130 09:01:15.843868 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a9af69-f9ef-444e-8505-ccf1eac1a036" containerName="keystone-cron" Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.843954 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a9af69-f9ef-444e-8505-ccf1eac1a036" containerName="keystone-cron" Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.844297 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a9af69-f9ef-444e-8505-ccf1eac1a036" containerName="keystone-cron" Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.846141 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.856743 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wkl4"] Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.994739 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-catalog-content\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.994825 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-utilities\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:15 crc kubenswrapper[4870]: I0130 09:01:15.995148 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqjf\" (UniqueName: \"kubernetes.io/projected/a3785ae9-ea5a-4e63-99b5-e2f370f32739-kube-api-access-5kqjf\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.096870 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-catalog-content\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.096939 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-utilities\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.097051 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqjf\" (UniqueName: \"kubernetes.io/projected/a3785ae9-ea5a-4e63-99b5-e2f370f32739-kube-api-access-5kqjf\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.097358 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-catalog-content\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.097432 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-utilities\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.129024 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqjf\" (UniqueName: \"kubernetes.io/projected/a3785ae9-ea5a-4e63-99b5-e2f370f32739-kube-api-access-5kqjf\") pod \"redhat-marketplace-9wkl4\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.186992 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:16 crc kubenswrapper[4870]: I0130 09:01:16.703911 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wkl4"] Jan 30 09:01:17 crc kubenswrapper[4870]: I0130 09:01:17.650484 4870 generic.go:334] "Generic (PLEG): container finished" podID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerID="2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74" exitCode=0 Jan 30 09:01:17 crc kubenswrapper[4870]: I0130 09:01:17.650581 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wkl4" event={"ID":"a3785ae9-ea5a-4e63-99b5-e2f370f32739","Type":"ContainerDied","Data":"2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74"} Jan 30 09:01:17 crc kubenswrapper[4870]: I0130 09:01:17.650844 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wkl4" event={"ID":"a3785ae9-ea5a-4e63-99b5-e2f370f32739","Type":"ContainerStarted","Data":"4f6eed6a15d8474e632789a235f8b8fe26a34b14f77f04f0be67129c66a15005"} Jan 30 09:01:20 crc kubenswrapper[4870]: I0130 09:01:20.681934 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wkl4" event={"ID":"a3785ae9-ea5a-4e63-99b5-e2f370f32739","Type":"ContainerStarted","Data":"7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae"} Jan 30 09:01:21 crc kubenswrapper[4870]: I0130 09:01:21.074852 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:01:21 crc kubenswrapper[4870]: E0130 09:01:21.075196 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:01:23 crc kubenswrapper[4870]: I0130 09:01:23.712418 4870 generic.go:334] "Generic (PLEG): container finished" podID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerID="7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae" exitCode=0 Jan 30 09:01:23 crc kubenswrapper[4870]: I0130 09:01:23.712499 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wkl4" event={"ID":"a3785ae9-ea5a-4e63-99b5-e2f370f32739","Type":"ContainerDied","Data":"7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae"} Jan 30 09:01:25 crc kubenswrapper[4870]: I0130 09:01:25.738616 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wkl4" event={"ID":"a3785ae9-ea5a-4e63-99b5-e2f370f32739","Type":"ContainerStarted","Data":"15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1"} Jan 30 09:01:25 crc kubenswrapper[4870]: I0130 09:01:25.770864 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9wkl4" podStartSLOduration=3.766599748 podStartE2EDuration="10.77084499s" podCreationTimestamp="2026-01-30 09:01:15 +0000 UTC" firstStartedPulling="2026-01-30 09:01:17.652059515 +0000 UTC m=+3116.347606624" lastFinishedPulling="2026-01-30 09:01:24.656304757 +0000 UTC m=+3123.351851866" observedRunningTime="2026-01-30 09:01:25.759517986 +0000 UTC m=+3124.455065105" watchObservedRunningTime="2026-01-30 09:01:25.77084499 +0000 UTC m=+3124.466392099" Jan 30 09:01:26 crc kubenswrapper[4870]: I0130 09:01:26.188075 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:26 crc kubenswrapper[4870]: I0130 
09:01:26.188204 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:27 crc kubenswrapper[4870]: I0130 09:01:27.232800 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-9wkl4" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="registry-server" probeResult="failure" output=< Jan 30 09:01:27 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 09:01:27 crc kubenswrapper[4870]: > Jan 30 09:01:34 crc kubenswrapper[4870]: I0130 09:01:34.075214 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:01:34 crc kubenswrapper[4870]: E0130 09:01:34.076169 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:01:36 crc kubenswrapper[4870]: I0130 09:01:36.237328 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:36 crc kubenswrapper[4870]: I0130 09:01:36.286535 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:36 crc kubenswrapper[4870]: I0130 09:01:36.473647 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wkl4"] Jan 30 09:01:37 crc kubenswrapper[4870]: I0130 09:01:37.851989 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9wkl4" 
podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="registry-server" containerID="cri-o://15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1" gracePeriod=2 Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.331563 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.475353 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kqjf\" (UniqueName: \"kubernetes.io/projected/a3785ae9-ea5a-4e63-99b5-e2f370f32739-kube-api-access-5kqjf\") pod \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.475628 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-utilities\") pod \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.475714 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-catalog-content\") pod \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\" (UID: \"a3785ae9-ea5a-4e63-99b5-e2f370f32739\") " Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.478929 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-utilities" (OuterVolumeSpecName: "utilities") pod "a3785ae9-ea5a-4e63-99b5-e2f370f32739" (UID: "a3785ae9-ea5a-4e63-99b5-e2f370f32739"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.485131 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3785ae9-ea5a-4e63-99b5-e2f370f32739-kube-api-access-5kqjf" (OuterVolumeSpecName: "kube-api-access-5kqjf") pod "a3785ae9-ea5a-4e63-99b5-e2f370f32739" (UID: "a3785ae9-ea5a-4e63-99b5-e2f370f32739"). InnerVolumeSpecName "kube-api-access-5kqjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.501140 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3785ae9-ea5a-4e63-99b5-e2f370f32739" (UID: "a3785ae9-ea5a-4e63-99b5-e2f370f32739"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.578056 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.578093 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kqjf\" (UniqueName: \"kubernetes.io/projected/a3785ae9-ea5a-4e63-99b5-e2f370f32739-kube-api-access-5kqjf\") on node \"crc\" DevicePath \"\"" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.578105 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3785ae9-ea5a-4e63-99b5-e2f370f32739-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.862272 4870 generic.go:334] "Generic (PLEG): container finished" podID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" 
containerID="15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1" exitCode=0 Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.862351 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wkl4" event={"ID":"a3785ae9-ea5a-4e63-99b5-e2f370f32739","Type":"ContainerDied","Data":"15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1"} Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.862361 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wkl4" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.862688 4870 scope.go:117] "RemoveContainer" containerID="15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.862667 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wkl4" event={"ID":"a3785ae9-ea5a-4e63-99b5-e2f370f32739","Type":"ContainerDied","Data":"4f6eed6a15d8474e632789a235f8b8fe26a34b14f77f04f0be67129c66a15005"} Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.892571 4870 scope.go:117] "RemoveContainer" containerID="7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.909607 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wkl4"] Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.918446 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wkl4"] Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.922074 4870 scope.go:117] "RemoveContainer" containerID="2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.995234 4870 scope.go:117] "RemoveContainer" containerID="15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1" Jan 30 
09:01:38 crc kubenswrapper[4870]: E0130 09:01:38.995704 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1\": container with ID starting with 15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1 not found: ID does not exist" containerID="15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.995754 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1"} err="failed to get container status \"15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1\": rpc error: code = NotFound desc = could not find container \"15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1\": container with ID starting with 15526d903032abe561dc01c511ea0d161fcbbc3cf7983a924b78c8907d98e7d1 not found: ID does not exist" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.995784 4870 scope.go:117] "RemoveContainer" containerID="7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae" Jan 30 09:01:38 crc kubenswrapper[4870]: E0130 09:01:38.996211 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae\": container with ID starting with 7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae not found: ID does not exist" containerID="7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.996302 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae"} err="failed to get container status 
\"7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae\": rpc error: code = NotFound desc = could not find container \"7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae\": container with ID starting with 7e3339d6ec35a3b27f4157c7f3cf254ce64e449a9490f4aed5dffbc96defaeae not found: ID does not exist" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.996366 4870 scope.go:117] "RemoveContainer" containerID="2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74" Jan 30 09:01:38 crc kubenswrapper[4870]: E0130 09:01:38.996648 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74\": container with ID starting with 2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74 not found: ID does not exist" containerID="2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74" Jan 30 09:01:38 crc kubenswrapper[4870]: I0130 09:01:38.996719 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74"} err="failed to get container status \"2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74\": rpc error: code = NotFound desc = could not find container \"2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74\": container with ID starting with 2c5fa9920d90bab0f1d5ce6bf20404df2d8adf83c6aa4e90b8ddd69a895a4a74 not found: ID does not exist" Jan 30 09:01:40 crc kubenswrapper[4870]: I0130 09:01:40.089696 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" path="/var/lib/kubelet/pods/a3785ae9-ea5a-4e63-99b5-e2f370f32739/volumes" Jan 30 09:01:45 crc kubenswrapper[4870]: I0130 09:01:45.076064 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 
09:01:45 crc kubenswrapper[4870]: E0130 09:01:45.077006 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:01:58 crc kubenswrapper[4870]: I0130 09:01:58.075552 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:01:58 crc kubenswrapper[4870]: E0130 09:01:58.076331 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:02:11 crc kubenswrapper[4870]: I0130 09:02:11.075179 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:02:11 crc kubenswrapper[4870]: E0130 09:02:11.076064 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:02:23 crc kubenswrapper[4870]: I0130 09:02:23.075111 4870 scope.go:117] "RemoveContainer" 
containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:02:23 crc kubenswrapper[4870]: E0130 09:02:23.075918 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:02:34 crc kubenswrapper[4870]: I0130 09:02:34.075236 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:02:34 crc kubenswrapper[4870]: E0130 09:02:34.077070 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:02:49 crc kubenswrapper[4870]: I0130 09:02:49.075312 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:02:49 crc kubenswrapper[4870]: E0130 09:02:49.076665 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:03:01 crc kubenswrapper[4870]: I0130 09:03:01.074900 4870 scope.go:117] 
"RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:03:01 crc kubenswrapper[4870]: E0130 09:03:01.075587 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:03:16 crc kubenswrapper[4870]: I0130 09:03:16.074772 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:03:16 crc kubenswrapper[4870]: E0130 09:03:16.075701 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:03:28 crc kubenswrapper[4870]: I0130 09:03:28.075587 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:03:28 crc kubenswrapper[4870]: I0130 09:03:28.878709 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"17820b765eafb63916466378e99044acfb43003b6255354ee74c2b5d6f218271"} Jan 30 09:05:55 crc kubenswrapper[4870]: I0130 09:05:55.250109 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:05:55 crc kubenswrapper[4870]: I0130 09:05:55.250570 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.013758 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-snfnh"] Jan 30 09:06:18 crc kubenswrapper[4870]: E0130 09:06:18.015368 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="extract-utilities" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.015405 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="extract-utilities" Jan 30 09:06:18 crc kubenswrapper[4870]: E0130 09:06:18.015446 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="extract-content" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.015462 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="extract-content" Jan 30 09:06:18 crc kubenswrapper[4870]: E0130 09:06:18.015515 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="registry-server" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.015530 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="registry-server" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.016015 4870 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a3785ae9-ea5a-4e63-99b5-e2f370f32739" containerName="registry-server" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.019258 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.024469 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snfnh"] Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.146592 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-utilities\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.146737 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wdqq\" (UniqueName: \"kubernetes.io/projected/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-kube-api-access-2wdqq\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.147125 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-catalog-content\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.249816 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-catalog-content\") pod \"redhat-operators-snfnh\" (UID: 
\"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.250056 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-utilities\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.250097 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wdqq\" (UniqueName: \"kubernetes.io/projected/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-kube-api-access-2wdqq\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.250455 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-catalog-content\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.250500 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-utilities\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.276146 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wdqq\" (UniqueName: \"kubernetes.io/projected/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-kube-api-access-2wdqq\") pod \"redhat-operators-snfnh\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " 
pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.365593 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:18 crc kubenswrapper[4870]: I0130 09:06:18.861381 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snfnh"] Jan 30 09:06:19 crc kubenswrapper[4870]: I0130 09:06:19.626511 4870 generic.go:334] "Generic (PLEG): container finished" podID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerID="e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734" exitCode=0 Jan 30 09:06:19 crc kubenswrapper[4870]: I0130 09:06:19.626577 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snfnh" event={"ID":"bb8d2db0-5110-4134-ab7a-df0a03ec80b4","Type":"ContainerDied","Data":"e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734"} Jan 30 09:06:19 crc kubenswrapper[4870]: I0130 09:06:19.627150 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snfnh" event={"ID":"bb8d2db0-5110-4134-ab7a-df0a03ec80b4","Type":"ContainerStarted","Data":"3dc7547cf515e9f204ef80e6b218eae2f4161ef84d86bd1dd66e594de10b3bf7"} Jan 30 09:06:19 crc kubenswrapper[4870]: I0130 09:06:19.630734 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 09:06:21 crc kubenswrapper[4870]: I0130 09:06:21.655514 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snfnh" event={"ID":"bb8d2db0-5110-4134-ab7a-df0a03ec80b4","Type":"ContainerStarted","Data":"02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622"} Jan 30 09:06:25 crc kubenswrapper[4870]: I0130 09:06:25.250220 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:06:25 crc kubenswrapper[4870]: I0130 09:06:25.250841 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:06:26 crc kubenswrapper[4870]: I0130 09:06:26.711929 4870 generic.go:334] "Generic (PLEG): container finished" podID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerID="02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622" exitCode=0 Jan 30 09:06:26 crc kubenswrapper[4870]: I0130 09:06:26.711978 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snfnh" event={"ID":"bb8d2db0-5110-4134-ab7a-df0a03ec80b4","Type":"ContainerDied","Data":"02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622"} Jan 30 09:06:27 crc kubenswrapper[4870]: I0130 09:06:27.722459 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snfnh" event={"ID":"bb8d2db0-5110-4134-ab7a-df0a03ec80b4","Type":"ContainerStarted","Data":"6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86"} Jan 30 09:06:27 crc kubenswrapper[4870]: I0130 09:06:27.747996 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-snfnh" podStartSLOduration=3.265277798 podStartE2EDuration="10.74797255s" podCreationTimestamp="2026-01-30 09:06:17 +0000 UTC" firstStartedPulling="2026-01-30 09:06:19.630406482 +0000 UTC m=+3418.325953591" lastFinishedPulling="2026-01-30 09:06:27.113101224 +0000 UTC m=+3425.808648343" observedRunningTime="2026-01-30 09:06:27.742052195 +0000 UTC m=+3426.437599314" 
watchObservedRunningTime="2026-01-30 09:06:27.74797255 +0000 UTC m=+3426.443519669" Jan 30 09:06:28 crc kubenswrapper[4870]: I0130 09:06:28.366186 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:28 crc kubenswrapper[4870]: I0130 09:06:28.366237 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:29 crc kubenswrapper[4870]: I0130 09:06:29.411965 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snfnh" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="registry-server" probeResult="failure" output=< Jan 30 09:06:29 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 09:06:29 crc kubenswrapper[4870]: > Jan 30 09:06:38 crc kubenswrapper[4870]: I0130 09:06:38.418730 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:38 crc kubenswrapper[4870]: I0130 09:06:38.469168 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:38 crc kubenswrapper[4870]: I0130 09:06:38.659805 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snfnh"] Jan 30 09:06:39 crc kubenswrapper[4870]: I0130 09:06:39.844506 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-snfnh" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="registry-server" containerID="cri-o://6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86" gracePeriod=2 Jan 30 09:06:40 crc kubenswrapper[4870]: E0130 09:06:40.046727 4870 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8d2db0_5110_4134_ab7a_df0a03ec80b4.slice/crio-6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8d2db0_5110_4134_ab7a_df0a03ec80b4.slice/crio-conmon-6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86.scope\": RecentStats: unable to find data in memory cache]" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.412451 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.549507 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-utilities\") pod \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.550024 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-catalog-content\") pod \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.550170 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wdqq\" (UniqueName: \"kubernetes.io/projected/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-kube-api-access-2wdqq\") pod \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\" (UID: \"bb8d2db0-5110-4134-ab7a-df0a03ec80b4\") " Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.550114 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-utilities" (OuterVolumeSpecName: "utilities") 
pod "bb8d2db0-5110-4134-ab7a-df0a03ec80b4" (UID: "bb8d2db0-5110-4134-ab7a-df0a03ec80b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.550858 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.567721 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-kube-api-access-2wdqq" (OuterVolumeSpecName: "kube-api-access-2wdqq") pod "bb8d2db0-5110-4134-ab7a-df0a03ec80b4" (UID: "bb8d2db0-5110-4134-ab7a-df0a03ec80b4"). InnerVolumeSpecName "kube-api-access-2wdqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.653333 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wdqq\" (UniqueName: \"kubernetes.io/projected/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-kube-api-access-2wdqq\") on node \"crc\" DevicePath \"\"" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.667142 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb8d2db0-5110-4134-ab7a-df0a03ec80b4" (UID: "bb8d2db0-5110-4134-ab7a-df0a03ec80b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.755237 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb8d2db0-5110-4134-ab7a-df0a03ec80b4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.859729 4870 generic.go:334] "Generic (PLEG): container finished" podID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerID="6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86" exitCode=0 Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.859774 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snfnh" event={"ID":"bb8d2db0-5110-4134-ab7a-df0a03ec80b4","Type":"ContainerDied","Data":"6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86"} Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.859814 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snfnh" event={"ID":"bb8d2db0-5110-4134-ab7a-df0a03ec80b4","Type":"ContainerDied","Data":"3dc7547cf515e9f204ef80e6b218eae2f4161ef84d86bd1dd66e594de10b3bf7"} Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.859833 4870 scope.go:117] "RemoveContainer" containerID="6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.859920 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snfnh" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.889607 4870 scope.go:117] "RemoveContainer" containerID="02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.919921 4870 scope.go:117] "RemoveContainer" containerID="e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734" Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.940735 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snfnh"] Jan 30 09:06:40 crc kubenswrapper[4870]: I0130 09:06:40.959728 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-snfnh"] Jan 30 09:06:41 crc kubenswrapper[4870]: I0130 09:06:41.014429 4870 scope.go:117] "RemoveContainer" containerID="6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86" Jan 30 09:06:41 crc kubenswrapper[4870]: E0130 09:06:41.015078 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86\": container with ID starting with 6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86 not found: ID does not exist" containerID="6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86" Jan 30 09:06:41 crc kubenswrapper[4870]: I0130 09:06:41.015123 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86"} err="failed to get container status \"6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86\": rpc error: code = NotFound desc = could not find container \"6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86\": container with ID starting with 6c012123932b91c605aa1585fbbe354ff2ad5cf49af930b09a080e233dd68a86 not found: ID does 
not exist" Jan 30 09:06:41 crc kubenswrapper[4870]: I0130 09:06:41.015153 4870 scope.go:117] "RemoveContainer" containerID="02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622" Jan 30 09:06:41 crc kubenswrapper[4870]: E0130 09:06:41.017106 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622\": container with ID starting with 02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622 not found: ID does not exist" containerID="02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622" Jan 30 09:06:41 crc kubenswrapper[4870]: I0130 09:06:41.017197 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622"} err="failed to get container status \"02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622\": rpc error: code = NotFound desc = could not find container \"02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622\": container with ID starting with 02481fea632748099b50b10076988a482fbae35ef176cc74caea7cfbad739622 not found: ID does not exist" Jan 30 09:06:41 crc kubenswrapper[4870]: I0130 09:06:41.017265 4870 scope.go:117] "RemoveContainer" containerID="e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734" Jan 30 09:06:41 crc kubenswrapper[4870]: E0130 09:06:41.019580 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734\": container with ID starting with e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734 not found: ID does not exist" containerID="e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734" Jan 30 09:06:41 crc kubenswrapper[4870]: I0130 09:06:41.019632 4870 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734"} err="failed to get container status \"e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734\": rpc error: code = NotFound desc = could not find container \"e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734\": container with ID starting with e4308c32c5a5a547b265ce131e450c23aa0c352d28f57d7cd2592d8609453734 not found: ID does not exist" Jan 30 09:06:42 crc kubenswrapper[4870]: I0130 09:06:42.090106 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" path="/var/lib/kubelet/pods/bb8d2db0-5110-4134-ab7a-df0a03ec80b4/volumes" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.337499 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lvc4n"] Jan 30 09:06:49 crc kubenswrapper[4870]: E0130 09:06:49.338988 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="extract-content" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.339012 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="extract-content" Jan 30 09:06:49 crc kubenswrapper[4870]: E0130 09:06:49.339044 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="extract-utilities" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.339056 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="extract-utilities" Jan 30 09:06:49 crc kubenswrapper[4870]: E0130 09:06:49.339097 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="registry-server" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.339108 4870 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="registry-server" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.339437 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8d2db0-5110-4134-ab7a-df0a03ec80b4" containerName="registry-server" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.341951 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.355111 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvc4n"] Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.445064 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99pl2\" (UniqueName: \"kubernetes.io/projected/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-kube-api-access-99pl2\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.445118 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-utilities\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.445801 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-catalog-content\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 
09:06:49.547751 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-catalog-content\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.547821 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99pl2\" (UniqueName: \"kubernetes.io/projected/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-kube-api-access-99pl2\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.547848 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-utilities\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.548343 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-catalog-content\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.548372 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-utilities\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.574561 4870 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99pl2\" (UniqueName: \"kubernetes.io/projected/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-kube-api-access-99pl2\") pod \"community-operators-lvc4n\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:49 crc kubenswrapper[4870]: I0130 09:06:49.687509 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:50 crc kubenswrapper[4870]: I0130 09:06:50.267898 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvc4n"] Jan 30 09:06:50 crc kubenswrapper[4870]: I0130 09:06:50.957087 4870 generic.go:334] "Generic (PLEG): container finished" podID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerID="dfc681f06958a7052b07273f65f791c31b68f37949a2c90c717d551f99d10c11" exitCode=0 Jan 30 09:06:50 crc kubenswrapper[4870]: I0130 09:06:50.957244 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvc4n" event={"ID":"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5","Type":"ContainerDied","Data":"dfc681f06958a7052b07273f65f791c31b68f37949a2c90c717d551f99d10c11"} Jan 30 09:06:50 crc kubenswrapper[4870]: I0130 09:06:50.957469 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvc4n" event={"ID":"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5","Type":"ContainerStarted","Data":"2b3d03bb0205d3f076d27bc63ac73956f2d285d0e523e900185a3e770c082a31"} Jan 30 09:06:51 crc kubenswrapper[4870]: I0130 09:06:51.967420 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvc4n" event={"ID":"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5","Type":"ContainerStarted","Data":"ab75b70c52c6135527b84b8c4e2005f9cc37ba58c868e52dc9783f77ed7c5b41"} Jan 30 09:06:53 crc kubenswrapper[4870]: I0130 09:06:53.984661 4870 
generic.go:334] "Generic (PLEG): container finished" podID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerID="ab75b70c52c6135527b84b8c4e2005f9cc37ba58c868e52dc9783f77ed7c5b41" exitCode=0 Jan 30 09:06:53 crc kubenswrapper[4870]: I0130 09:06:53.984718 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvc4n" event={"ID":"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5","Type":"ContainerDied","Data":"ab75b70c52c6135527b84b8c4e2005f9cc37ba58c868e52dc9783f77ed7c5b41"} Jan 30 09:06:55 crc kubenswrapper[4870]: I0130 09:06:55.016272 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvc4n" event={"ID":"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5","Type":"ContainerStarted","Data":"313fd118fe8f2f8b51760cde13bda6c7424565b20047111d85a7a318625300f6"} Jan 30 09:06:55 crc kubenswrapper[4870]: I0130 09:06:55.040099 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lvc4n" podStartSLOduration=2.423203336 podStartE2EDuration="6.040074178s" podCreationTimestamp="2026-01-30 09:06:49 +0000 UTC" firstStartedPulling="2026-01-30 09:06:50.960612453 +0000 UTC m=+3449.656159562" lastFinishedPulling="2026-01-30 09:06:54.577483295 +0000 UTC m=+3453.273030404" observedRunningTime="2026-01-30 09:06:55.036264718 +0000 UTC m=+3453.731811827" watchObservedRunningTime="2026-01-30 09:06:55.040074178 +0000 UTC m=+3453.735621277" Jan 30 09:06:55 crc kubenswrapper[4870]: I0130 09:06:55.249955 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:06:55 crc kubenswrapper[4870]: I0130 09:06:55.250020 4870 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:06:55 crc kubenswrapper[4870]: I0130 09:06:55.250075 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 09:06:55 crc kubenswrapper[4870]: I0130 09:06:55.250938 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17820b765eafb63916466378e99044acfb43003b6255354ee74c2b5d6f218271"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 09:06:55 crc kubenswrapper[4870]: I0130 09:06:55.251000 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://17820b765eafb63916466378e99044acfb43003b6255354ee74c2b5d6f218271" gracePeriod=600 Jan 30 09:06:56 crc kubenswrapper[4870]: I0130 09:06:56.028217 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="17820b765eafb63916466378e99044acfb43003b6255354ee74c2b5d6f218271" exitCode=0 Jan 30 09:06:56 crc kubenswrapper[4870]: I0130 09:06:56.028413 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"17820b765eafb63916466378e99044acfb43003b6255354ee74c2b5d6f218271"} Jan 30 09:06:56 crc kubenswrapper[4870]: I0130 09:06:56.028670 4870 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271"} Jan 30 09:06:56 crc kubenswrapper[4870]: I0130 09:06:56.028691 4870 scope.go:117] "RemoveContainer" containerID="166951655cf7460235835367ab57cecab9ac55ffa7af60737e9d2ae75e6d1283" Jan 30 09:06:59 crc kubenswrapper[4870]: I0130 09:06:59.687645 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:59 crc kubenswrapper[4870]: I0130 09:06:59.688259 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:06:59 crc kubenswrapper[4870]: I0130 09:06:59.751484 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:07:00 crc kubenswrapper[4870]: I0130 09:07:00.159313 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:07:00 crc kubenswrapper[4870]: I0130 09:07:00.212180 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvc4n"] Jan 30 09:07:02 crc kubenswrapper[4870]: I0130 09:07:02.124045 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lvc4n" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="registry-server" containerID="cri-o://313fd118fe8f2f8b51760cde13bda6c7424565b20047111d85a7a318625300f6" gracePeriod=2 Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.133736 4870 generic.go:334] "Generic (PLEG): container finished" podID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerID="313fd118fe8f2f8b51760cde13bda6c7424565b20047111d85a7a318625300f6" exitCode=0 Jan 30 09:07:03 
crc kubenswrapper[4870]: I0130 09:07:03.133812 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvc4n" event={"ID":"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5","Type":"ContainerDied","Data":"313fd118fe8f2f8b51760cde13bda6c7424565b20047111d85a7a318625300f6"} Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.251050 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.374220 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-catalog-content\") pod \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.374395 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99pl2\" (UniqueName: \"kubernetes.io/projected/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-kube-api-access-99pl2\") pod \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.374444 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-utilities\") pod \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\" (UID: \"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5\") " Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.375258 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-utilities" (OuterVolumeSpecName: "utilities") pod "79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" (UID: "79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.379973 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-kube-api-access-99pl2" (OuterVolumeSpecName: "kube-api-access-99pl2") pod "79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" (UID: "79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5"). InnerVolumeSpecName "kube-api-access-99pl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.439971 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" (UID: "79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.476225 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.476252 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99pl2\" (UniqueName: \"kubernetes.io/projected/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-kube-api-access-99pl2\") on node \"crc\" DevicePath \"\"" Jan 30 09:07:03 crc kubenswrapper[4870]: I0130 09:07:03.476265 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:07:04 crc kubenswrapper[4870]: I0130 09:07:04.150386 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvc4n" 
event={"ID":"79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5","Type":"ContainerDied","Data":"2b3d03bb0205d3f076d27bc63ac73956f2d285d0e523e900185a3e770c082a31"} Jan 30 09:07:04 crc kubenswrapper[4870]: I0130 09:07:04.150443 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvc4n" Jan 30 09:07:04 crc kubenswrapper[4870]: I0130 09:07:04.150801 4870 scope.go:117] "RemoveContainer" containerID="313fd118fe8f2f8b51760cde13bda6c7424565b20047111d85a7a318625300f6" Jan 30 09:07:04 crc kubenswrapper[4870]: I0130 09:07:04.190849 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvc4n"] Jan 30 09:07:04 crc kubenswrapper[4870]: I0130 09:07:04.195868 4870 scope.go:117] "RemoveContainer" containerID="ab75b70c52c6135527b84b8c4e2005f9cc37ba58c868e52dc9783f77ed7c5b41" Jan 30 09:07:04 crc kubenswrapper[4870]: I0130 09:07:04.200813 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lvc4n"] Jan 30 09:07:04 crc kubenswrapper[4870]: I0130 09:07:04.219137 4870 scope.go:117] "RemoveContainer" containerID="dfc681f06958a7052b07273f65f791c31b68f37949a2c90c717d551f99d10c11" Jan 30 09:07:06 crc kubenswrapper[4870]: I0130 09:07:06.091669 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" path="/var/lib/kubelet/pods/79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5/volumes" Jan 30 09:08:55 crc kubenswrapper[4870]: I0130 09:08:55.249338 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:08:55 crc kubenswrapper[4870]: I0130 09:08:55.249961 4870 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:09:25 crc kubenswrapper[4870]: I0130 09:09:25.250031 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:09:25 crc kubenswrapper[4870]: I0130 09:09:25.250835 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.250248 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.250997 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.251078 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 09:09:55 crc 
kubenswrapper[4870]: I0130 09:09:55.252175 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.252273 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" gracePeriod=600 Jan 30 09:09:55 crc kubenswrapper[4870]: E0130 09:09:55.390015 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.938263 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" exitCode=0 Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.938332 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271"} Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.938607 4870 scope.go:117] "RemoveContainer" 
containerID="17820b765eafb63916466378e99044acfb43003b6255354ee74c2b5d6f218271" Jan 30 09:09:55 crc kubenswrapper[4870]: I0130 09:09:55.939214 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:09:55 crc kubenswrapper[4870]: E0130 09:09:55.939447 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:10:10 crc kubenswrapper[4870]: I0130 09:10:10.079100 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:10:10 crc kubenswrapper[4870]: E0130 09:10:10.080077 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:10:22 crc kubenswrapper[4870]: I0130 09:10:22.081236 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:10:22 crc kubenswrapper[4870]: E0130 09:10:22.082039 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:10:36 crc kubenswrapper[4870]: I0130 09:10:36.079759 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:10:36 crc kubenswrapper[4870]: E0130 09:10:36.084586 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:10:50 crc kubenswrapper[4870]: I0130 09:10:50.074579 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:10:50 crc kubenswrapper[4870]: E0130 09:10:50.075531 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:11:05 crc kubenswrapper[4870]: I0130 09:11:05.075201 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:11:05 crc kubenswrapper[4870]: E0130 09:11:05.076403 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:11:18 crc kubenswrapper[4870]: I0130 09:11:18.075204 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:11:18 crc kubenswrapper[4870]: E0130 09:11:18.076120 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:11:33 crc kubenswrapper[4870]: I0130 09:11:33.075006 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:11:33 crc kubenswrapper[4870]: E0130 09:11:33.075755 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.318167 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n2nf2"] Jan 30 09:11:34 crc kubenswrapper[4870]: E0130 09:11:34.319217 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="extract-utilities" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 
09:11:34.319233 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="extract-utilities" Jan 30 09:11:34 crc kubenswrapper[4870]: E0130 09:11:34.319241 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="registry-server" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.319247 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="registry-server" Jan 30 09:11:34 crc kubenswrapper[4870]: E0130 09:11:34.319291 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="extract-content" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.319299 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="extract-content" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.319500 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a84ac4-c8e1-43b6-a53a-1402a9c2c1a5" containerName="registry-server" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.320833 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.344129 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2nf2"] Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.479586 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hshs\" (UniqueName: \"kubernetes.io/projected/0b0a02d2-bfb4-4725-838b-bda2924a28d6-kube-api-access-8hshs\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.479983 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-catalog-content\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.480148 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-utilities\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.582336 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hshs\" (UniqueName: \"kubernetes.io/projected/0b0a02d2-bfb4-4725-838b-bda2924a28d6-kube-api-access-8hshs\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.582462 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-catalog-content\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.582540 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-utilities\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.583029 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-utilities\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.583036 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-catalog-content\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.607973 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hshs\" (UniqueName: \"kubernetes.io/projected/0b0a02d2-bfb4-4725-838b-bda2924a28d6-kube-api-access-8hshs\") pod \"redhat-marketplace-n2nf2\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:34 crc kubenswrapper[4870]: I0130 09:11:34.651826 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:35 crc kubenswrapper[4870]: I0130 09:11:35.153699 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2nf2"] Jan 30 09:11:36 crc kubenswrapper[4870]: I0130 09:11:36.008677 4870 generic.go:334] "Generic (PLEG): container finished" podID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerID="43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77" exitCode=0 Jan 30 09:11:36 crc kubenswrapper[4870]: I0130 09:11:36.008736 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2nf2" event={"ID":"0b0a02d2-bfb4-4725-838b-bda2924a28d6","Type":"ContainerDied","Data":"43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77"} Jan 30 09:11:36 crc kubenswrapper[4870]: I0130 09:11:36.009080 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2nf2" event={"ID":"0b0a02d2-bfb4-4725-838b-bda2924a28d6","Type":"ContainerStarted","Data":"1d1de4ca13f9f6608f25d6cd27b3c679f2cf0a58ab225bd7e3be562c7a5733eb"} Jan 30 09:11:36 crc kubenswrapper[4870]: I0130 09:11:36.012560 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 09:11:38 crc kubenswrapper[4870]: I0130 09:11:38.034177 4870 generic.go:334] "Generic (PLEG): container finished" podID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerID="4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4" exitCode=0 Jan 30 09:11:38 crc kubenswrapper[4870]: I0130 09:11:38.034244 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2nf2" event={"ID":"0b0a02d2-bfb4-4725-838b-bda2924a28d6","Type":"ContainerDied","Data":"4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4"} Jan 30 09:11:39 crc kubenswrapper[4870]: I0130 09:11:39.048779 4870 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-n2nf2" event={"ID":"0b0a02d2-bfb4-4725-838b-bda2924a28d6","Type":"ContainerStarted","Data":"8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a"} Jan 30 09:11:39 crc kubenswrapper[4870]: I0130 09:11:39.082201 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n2nf2" podStartSLOduration=2.644069479 podStartE2EDuration="5.082176402s" podCreationTimestamp="2026-01-30 09:11:34 +0000 UTC" firstStartedPulling="2026-01-30 09:11:36.012269869 +0000 UTC m=+3734.707816978" lastFinishedPulling="2026-01-30 09:11:38.450376772 +0000 UTC m=+3737.145923901" observedRunningTime="2026-01-30 09:11:39.07377465 +0000 UTC m=+3737.769321769" watchObservedRunningTime="2026-01-30 09:11:39.082176402 +0000 UTC m=+3737.777723511" Jan 30 09:11:44 crc kubenswrapper[4870]: I0130 09:11:44.307683 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:11:44 crc kubenswrapper[4870]: E0130 09:11:44.309021 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:11:44 crc kubenswrapper[4870]: I0130 09:11:44.652643 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:44 crc kubenswrapper[4870]: I0130 09:11:44.652799 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:44 crc kubenswrapper[4870]: I0130 09:11:44.726739 4870 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:45 crc kubenswrapper[4870]: I0130 09:11:45.387227 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:45 crc kubenswrapper[4870]: I0130 09:11:45.441506 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2nf2"] Jan 30 09:11:47 crc kubenswrapper[4870]: I0130 09:11:47.351699 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n2nf2" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerName="registry-server" containerID="cri-o://8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a" gracePeriod=2 Jan 30 09:11:47 crc kubenswrapper[4870]: I0130 09:11:47.843747 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:47 crc kubenswrapper[4870]: I0130 09:11:47.977068 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-utilities\") pod \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " Jan 30 09:11:47 crc kubenswrapper[4870]: I0130 09:11:47.977309 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hshs\" (UniqueName: \"kubernetes.io/projected/0b0a02d2-bfb4-4725-838b-bda2924a28d6-kube-api-access-8hshs\") pod \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " Jan 30 09:11:47 crc kubenswrapper[4870]: I0130 09:11:47.977416 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-catalog-content\") pod 
\"0b0a02d2-bfb4-4725-838b-bda2924a28d6\" (UID: \"0b0a02d2-bfb4-4725-838b-bda2924a28d6\") " Jan 30 09:11:47 crc kubenswrapper[4870]: I0130 09:11:47.978814 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-utilities" (OuterVolumeSpecName: "utilities") pod "0b0a02d2-bfb4-4725-838b-bda2924a28d6" (UID: "0b0a02d2-bfb4-4725-838b-bda2924a28d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:11:47 crc kubenswrapper[4870]: I0130 09:11:47.994887 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b0a02d2-bfb4-4725-838b-bda2924a28d6-kube-api-access-8hshs" (OuterVolumeSpecName: "kube-api-access-8hshs") pod "0b0a02d2-bfb4-4725-838b-bda2924a28d6" (UID: "0b0a02d2-bfb4-4725-838b-bda2924a28d6"). InnerVolumeSpecName "kube-api-access-8hshs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.022370 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b0a02d2-bfb4-4725-838b-bda2924a28d6" (UID: "0b0a02d2-bfb4-4725-838b-bda2924a28d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.079304 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hshs\" (UniqueName: \"kubernetes.io/projected/0b0a02d2-bfb4-4725-838b-bda2924a28d6-kube-api-access-8hshs\") on node \"crc\" DevicePath \"\"" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.079340 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.079352 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b0a02d2-bfb4-4725-838b-bda2924a28d6-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.364480 4870 generic.go:334] "Generic (PLEG): container finished" podID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerID="8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a" exitCode=0 Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.364526 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2nf2" event={"ID":"0b0a02d2-bfb4-4725-838b-bda2924a28d6","Type":"ContainerDied","Data":"8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a"} Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.364800 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n2nf2" event={"ID":"0b0a02d2-bfb4-4725-838b-bda2924a28d6","Type":"ContainerDied","Data":"1d1de4ca13f9f6608f25d6cd27b3c679f2cf0a58ab225bd7e3be562c7a5733eb"} Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.364823 4870 scope.go:117] "RemoveContainer" containerID="8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 
09:11:48.364552 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n2nf2" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.399322 4870 scope.go:117] "RemoveContainer" containerID="4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.409603 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2nf2"] Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.418905 4870 scope.go:117] "RemoveContainer" containerID="43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.423667 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n2nf2"] Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.487655 4870 scope.go:117] "RemoveContainer" containerID="8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a" Jan 30 09:11:48 crc kubenswrapper[4870]: E0130 09:11:48.489308 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a\": container with ID starting with 8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a not found: ID does not exist" containerID="8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.489360 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a"} err="failed to get container status \"8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a\": rpc error: code = NotFound desc = could not find container \"8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a\": container with ID starting with 
8d3386c464d49e524eb2a7cdc9cea31342222d28508dcddb5b16f2efa933c76a not found: ID does not exist" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.489387 4870 scope.go:117] "RemoveContainer" containerID="4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4" Jan 30 09:11:48 crc kubenswrapper[4870]: E0130 09:11:48.489711 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4\": container with ID starting with 4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4 not found: ID does not exist" containerID="4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.489731 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4"} err="failed to get container status \"4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4\": rpc error: code = NotFound desc = could not find container \"4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4\": container with ID starting with 4593f224dcf2b275752cd45222193503c76f7026129b283d30cb9747bfbd58f4 not found: ID does not exist" Jan 30 09:11:48 crc kubenswrapper[4870]: I0130 09:11:48.489750 4870 scope.go:117] "RemoveContainer" containerID="43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77" Jan 30 09:11:48 crc kubenswrapper[4870]: E0130 09:11:48.489947 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77\": container with ID starting with 43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77 not found: ID does not exist" containerID="43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77" Jan 30 09:11:48 crc 
kubenswrapper[4870]: I0130 09:11:48.489969 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77"} err="failed to get container status \"43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77\": rpc error: code = NotFound desc = could not find container \"43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77\": container with ID starting with 43803059d2288b81f6f92667f5cc43f474d24e8dc56036acc253d6ea3bfd7b77 not found: ID does not exist" Jan 30 09:11:50 crc kubenswrapper[4870]: I0130 09:11:50.087651 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" path="/var/lib/kubelet/pods/0b0a02d2-bfb4-4725-838b-bda2924a28d6/volumes" Jan 30 09:11:55 crc kubenswrapper[4870]: I0130 09:11:55.075100 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:11:55 crc kubenswrapper[4870]: E0130 09:11:55.076542 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:12:07 crc kubenswrapper[4870]: I0130 09:12:07.075054 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:12:07 crc kubenswrapper[4870]: E0130 09:12:07.075926 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:12:19 crc kubenswrapper[4870]: I0130 09:12:19.075624 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:12:19 crc kubenswrapper[4870]: E0130 09:12:19.076325 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:12:32 crc kubenswrapper[4870]: I0130 09:12:32.080801 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:12:32 crc kubenswrapper[4870]: E0130 09:12:32.081574 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:12:44 crc kubenswrapper[4870]: I0130 09:12:44.075203 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:12:44 crc kubenswrapper[4870]: E0130 09:12:44.076224 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:12:56 crc kubenswrapper[4870]: I0130 09:12:56.075480 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:12:56 crc kubenswrapper[4870]: E0130 09:12:56.076784 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:13:11 crc kubenswrapper[4870]: I0130 09:13:11.074813 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:13:11 crc kubenswrapper[4870]: E0130 09:13:11.075680 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:13:25 crc kubenswrapper[4870]: I0130 09:13:25.075229 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:13:25 crc kubenswrapper[4870]: E0130 09:13:25.076385 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:13:40 crc kubenswrapper[4870]: I0130 09:13:40.074642 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:13:40 crc kubenswrapper[4870]: E0130 09:13:40.075572 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:13:51 crc kubenswrapper[4870]: I0130 09:13:51.074939 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:13:51 crc kubenswrapper[4870]: E0130 09:13:51.076277 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:14:03 crc kubenswrapper[4870]: I0130 09:14:03.074859 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:14:03 crc kubenswrapper[4870]: E0130 09:14:03.075791 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:14:18 crc kubenswrapper[4870]: I0130 09:14:18.074789 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:14:18 crc kubenswrapper[4870]: E0130 09:14:18.075635 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:14:30 crc kubenswrapper[4870]: I0130 09:14:30.075089 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:14:30 crc kubenswrapper[4870]: E0130 09:14:30.076219 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:14:43 crc kubenswrapper[4870]: I0130 09:14:43.074785 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:14:43 crc kubenswrapper[4870]: E0130 09:14:43.075634 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:14:58 crc kubenswrapper[4870]: I0130 09:14:58.074961 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271" Jan 30 09:14:59 crc kubenswrapper[4870]: I0130 09:14:59.254241 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"78ce0353b7c3323cf1e9811378a7fe1749f61d99675d2c6c610bade4f28a064b"} Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.206379 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n"] Jan 30 09:15:00 crc kubenswrapper[4870]: E0130 09:15:00.207216 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerName="extract-content" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.207244 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerName="extract-content" Jan 30 09:15:00 crc kubenswrapper[4870]: E0130 09:15:00.207269 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerName="registry-server" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.207278 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerName="registry-server" Jan 30 09:15:00 crc kubenswrapper[4870]: E0130 09:15:00.207311 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" 
containerName="extract-utilities" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.207319 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerName="extract-utilities" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.207577 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b0a02d2-bfb4-4725-838b-bda2924a28d6" containerName="registry-server" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.208473 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.210154 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.210791 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.227461 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n"] Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.280338 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5nn9\" (UniqueName: \"kubernetes.io/projected/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-kube-api-access-v5nn9\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.280664 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-secret-volume\") pod 
\"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.280774 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-config-volume\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.382509 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5nn9\" (UniqueName: \"kubernetes.io/projected/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-kube-api-access-v5nn9\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.382692 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-secret-volume\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.382755 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-config-volume\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.383841 4870 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-config-volume\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.395178 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-secret-volume\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.402342 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5nn9\" (UniqueName: \"kubernetes.io/projected/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-kube-api-access-v5nn9\") pod \"collect-profiles-29496075-p689n\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:00 crc kubenswrapper[4870]: I0130 09:15:00.543927 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:01 crc kubenswrapper[4870]: I0130 09:15:01.032281 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n"] Jan 30 09:15:01 crc kubenswrapper[4870]: I0130 09:15:01.271582 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" event={"ID":"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f","Type":"ContainerStarted","Data":"57695176c047a4d2a3192f1a496290ea01944208e9a6861fc8400e87ad5e75a8"} Jan 30 09:15:01 crc kubenswrapper[4870]: I0130 09:15:01.271634 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" event={"ID":"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f","Type":"ContainerStarted","Data":"6de73acc6cb9cb438f39e404589a32d8071470536db0821999d3d4de93352672"} Jan 30 09:15:01 crc kubenswrapper[4870]: I0130 09:15:01.291006 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" podStartSLOduration=1.290982603 podStartE2EDuration="1.290982603s" podCreationTimestamp="2026-01-30 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 09:15:01.28608401 +0000 UTC m=+3939.981631129" watchObservedRunningTime="2026-01-30 09:15:01.290982603 +0000 UTC m=+3939.986529732" Jan 30 09:15:02 crc kubenswrapper[4870]: I0130 09:15:02.283212 4870 generic.go:334] "Generic (PLEG): container finished" podID="c87e8dab-4aab-4cd8-bcb4-29d45c0f884f" containerID="57695176c047a4d2a3192f1a496290ea01944208e9a6861fc8400e87ad5e75a8" exitCode=0 Jan 30 09:15:02 crc kubenswrapper[4870]: I0130 09:15:02.283287 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" event={"ID":"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f","Type":"ContainerDied","Data":"57695176c047a4d2a3192f1a496290ea01944208e9a6861fc8400e87ad5e75a8"} Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.733906 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.864099 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-secret-volume\") pod \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.864506 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-config-volume\") pod \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.865009 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-config-volume" (OuterVolumeSpecName: "config-volume") pod "c87e8dab-4aab-4cd8-bcb4-29d45c0f884f" (UID: "c87e8dab-4aab-4cd8-bcb4-29d45c0f884f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.865303 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5nn9\" (UniqueName: \"kubernetes.io/projected/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-kube-api-access-v5nn9\") pod \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\" (UID: \"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f\") " Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.866098 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.873757 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-kube-api-access-v5nn9" (OuterVolumeSpecName: "kube-api-access-v5nn9") pod "c87e8dab-4aab-4cd8-bcb4-29d45c0f884f" (UID: "c87e8dab-4aab-4cd8-bcb4-29d45c0f884f"). InnerVolumeSpecName "kube-api-access-v5nn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.878230 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c87e8dab-4aab-4cd8-bcb4-29d45c0f884f" (UID: "c87e8dab-4aab-4cd8-bcb4-29d45c0f884f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.967756 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5nn9\" (UniqueName: \"kubernetes.io/projected/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-kube-api-access-v5nn9\") on node \"crc\" DevicePath \"\"" Jan 30 09:15:03 crc kubenswrapper[4870]: I0130 09:15:03.967803 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c87e8dab-4aab-4cd8-bcb4-29d45c0f884f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 09:15:04 crc kubenswrapper[4870]: I0130 09:15:04.301523 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" event={"ID":"c87e8dab-4aab-4cd8-bcb4-29d45c0f884f","Type":"ContainerDied","Data":"6de73acc6cb9cb438f39e404589a32d8071470536db0821999d3d4de93352672"} Jan 30 09:15:04 crc kubenswrapper[4870]: I0130 09:15:04.301561 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6de73acc6cb9cb438f39e404589a32d8071470536db0821999d3d4de93352672" Jan 30 09:15:04 crc kubenswrapper[4870]: I0130 09:15:04.301583 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496075-p689n" Jan 30 09:15:04 crc kubenswrapper[4870]: I0130 09:15:04.378809 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c"] Jan 30 09:15:04 crc kubenswrapper[4870]: I0130 09:15:04.391380 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496030-smd5c"] Jan 30 09:15:06 crc kubenswrapper[4870]: I0130 09:15:06.089660 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ecb0f40-780e-4f90-84aa-17af92178d88" path="/var/lib/kubelet/pods/3ecb0f40-780e-4f90-84aa-17af92178d88/volumes" Jan 30 09:15:38 crc kubenswrapper[4870]: I0130 09:15:38.714206 4870 scope.go:117] "RemoveContainer" containerID="dad0dbfc8aebf8b014e37d2f50b6d2deebcdfeb8419d761ea14c44680273c1c3" Jan 30 09:16:20 crc kubenswrapper[4870]: I0130 09:16:20.939175 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dbjtr"] Jan 30 09:16:20 crc kubenswrapper[4870]: E0130 09:16:20.940039 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87e8dab-4aab-4cd8-bcb4-29d45c0f884f" containerName="collect-profiles" Jan 30 09:16:20 crc kubenswrapper[4870]: I0130 09:16:20.940053 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87e8dab-4aab-4cd8-bcb4-29d45c0f884f" containerName="collect-profiles" Jan 30 09:16:20 crc kubenswrapper[4870]: I0130 09:16:20.940266 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87e8dab-4aab-4cd8-bcb4-29d45c0f884f" containerName="collect-profiles" Jan 30 09:16:20 crc kubenswrapper[4870]: I0130 09:16:20.941599 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:20 crc kubenswrapper[4870]: I0130 09:16:20.968548 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbjtr"] Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.059599 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-utilities\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.059694 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-catalog-content\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.059763 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwkbc\" (UniqueName: \"kubernetes.io/projected/aad36608-62a8-434a-899e-2383285678ba-kube-api-access-cwkbc\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.162640 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-utilities\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.162784 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-catalog-content\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.162929 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwkbc\" (UniqueName: \"kubernetes.io/projected/aad36608-62a8-434a-899e-2383285678ba-kube-api-access-cwkbc\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.163188 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-utilities\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.163354 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-catalog-content\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.199592 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwkbc\" (UniqueName: \"kubernetes.io/projected/aad36608-62a8-434a-899e-2383285678ba-kube-api-access-cwkbc\") pod \"redhat-operators-dbjtr\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.279208 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:21 crc kubenswrapper[4870]: I0130 09:16:21.784688 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbjtr"] Jan 30 09:16:22 crc kubenswrapper[4870]: I0130 09:16:22.128532 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerStarted","Data":"52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b"} Jan 30 09:16:22 crc kubenswrapper[4870]: I0130 09:16:22.129647 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerStarted","Data":"1c7d72833670a470a189ef9ff7a4db35f8fb4ede35202afd047cf7958ef6ba71"} Jan 30 09:16:23 crc kubenswrapper[4870]: I0130 09:16:23.148561 4870 generic.go:334] "Generic (PLEG): container finished" podID="aad36608-62a8-434a-899e-2383285678ba" containerID="52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b" exitCode=0 Jan 30 09:16:23 crc kubenswrapper[4870]: I0130 09:16:23.148672 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerDied","Data":"52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b"} Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.136858 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjl7n"] Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.140171 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.150827 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjl7n"] Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.339765 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-utilities\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.340154 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6k9s\" (UniqueName: \"kubernetes.io/projected/d63faf5c-d054-4828-b211-c0f100f1f4ca-kube-api-access-n6k9s\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.340334 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-catalog-content\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.442894 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-utilities\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.442962 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n6k9s\" (UniqueName: \"kubernetes.io/projected/d63faf5c-d054-4828-b211-c0f100f1f4ca-kube-api-access-n6k9s\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.443048 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-catalog-content\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.443585 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-utilities\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.443636 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-catalog-content\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.471744 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6k9s\" (UniqueName: \"kubernetes.io/projected/d63faf5c-d054-4828-b211-c0f100f1f4ca-kube-api-access-n6k9s\") pod \"certified-operators-jjl7n\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") " pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:24 crc kubenswrapper[4870]: I0130 09:16:24.762285 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:25 crc kubenswrapper[4870]: I0130 09:16:25.170577 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerStarted","Data":"3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e"} Jan 30 09:16:25 crc kubenswrapper[4870]: I0130 09:16:25.371395 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjl7n"] Jan 30 09:16:26 crc kubenswrapper[4870]: I0130 09:16:26.185764 4870 generic.go:334] "Generic (PLEG): container finished" podID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerID="9ce88b36b7f8bb4c84d2e0cb990ff2dfc0b7ef3e5a70fa171000086079fff96f" exitCode=0 Jan 30 09:16:26 crc kubenswrapper[4870]: I0130 09:16:26.186866 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl7n" event={"ID":"d63faf5c-d054-4828-b211-c0f100f1f4ca","Type":"ContainerDied","Data":"9ce88b36b7f8bb4c84d2e0cb990ff2dfc0b7ef3e5a70fa171000086079fff96f"} Jan 30 09:16:26 crc kubenswrapper[4870]: I0130 09:16:26.186963 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl7n" event={"ID":"d63faf5c-d054-4828-b211-c0f100f1f4ca","Type":"ContainerStarted","Data":"15ef4d9c1175afeaf8942eee6fe6591c948a8b9b5f0d512ace1d005e4aefce89"} Jan 30 09:16:28 crc kubenswrapper[4870]: I0130 09:16:28.229580 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl7n" event={"ID":"d63faf5c-d054-4828-b211-c0f100f1f4ca","Type":"ContainerStarted","Data":"03a4e150a1883a14201e377adab34fa66160798fa54ff5e83c49cff343c943f0"} Jan 30 09:16:31 crc kubenswrapper[4870]: I0130 09:16:31.269827 4870 generic.go:334] "Generic (PLEG): container finished" podID="d63faf5c-d054-4828-b211-c0f100f1f4ca" 
containerID="03a4e150a1883a14201e377adab34fa66160798fa54ff5e83c49cff343c943f0" exitCode=0 Jan 30 09:16:31 crc kubenswrapper[4870]: I0130 09:16:31.269922 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl7n" event={"ID":"d63faf5c-d054-4828-b211-c0f100f1f4ca","Type":"ContainerDied","Data":"03a4e150a1883a14201e377adab34fa66160798fa54ff5e83c49cff343c943f0"} Jan 30 09:16:32 crc kubenswrapper[4870]: I0130 09:16:32.286614 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl7n" event={"ID":"d63faf5c-d054-4828-b211-c0f100f1f4ca","Type":"ContainerStarted","Data":"9d90852ba42919283ddf463edcc40bc4b7429a7159617021ec5a64efb04dbac5"} Jan 30 09:16:32 crc kubenswrapper[4870]: I0130 09:16:32.290149 4870 generic.go:334] "Generic (PLEG): container finished" podID="aad36608-62a8-434a-899e-2383285678ba" containerID="3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e" exitCode=0 Jan 30 09:16:32 crc kubenswrapper[4870]: I0130 09:16:32.290191 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerDied","Data":"3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e"} Jan 30 09:16:32 crc kubenswrapper[4870]: I0130 09:16:32.326511 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjl7n" podStartSLOduration=2.760471499 podStartE2EDuration="8.326494217s" podCreationTimestamp="2026-01-30 09:16:24 +0000 UTC" firstStartedPulling="2026-01-30 09:16:26.197945676 +0000 UTC m=+4024.893492825" lastFinishedPulling="2026-01-30 09:16:31.763968424 +0000 UTC m=+4030.459515543" observedRunningTime="2026-01-30 09:16:32.316713862 +0000 UTC m=+4031.012260971" watchObservedRunningTime="2026-01-30 09:16:32.326494217 +0000 UTC m=+4031.022041316" Jan 30 09:16:33 crc kubenswrapper[4870]: I0130 
09:16:33.331488 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerStarted","Data":"b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9"} Jan 30 09:16:33 crc kubenswrapper[4870]: I0130 09:16:33.360844 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dbjtr" podStartSLOduration=3.627122035 podStartE2EDuration="13.360821339s" podCreationTimestamp="2026-01-30 09:16:20 +0000 UTC" firstStartedPulling="2026-01-30 09:16:23.151988243 +0000 UTC m=+4021.847535352" lastFinishedPulling="2026-01-30 09:16:32.885687547 +0000 UTC m=+4031.581234656" observedRunningTime="2026-01-30 09:16:33.352480429 +0000 UTC m=+4032.048027538" watchObservedRunningTime="2026-01-30 09:16:33.360821339 +0000 UTC m=+4032.056368448" Jan 30 09:16:34 crc kubenswrapper[4870]: I0130 09:16:34.762817 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:34 crc kubenswrapper[4870]: I0130 09:16:34.763219 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:35 crc kubenswrapper[4870]: I0130 09:16:35.828371 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jjl7n" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="registry-server" probeResult="failure" output=< Jan 30 09:16:35 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 09:16:35 crc kubenswrapper[4870]: > Jan 30 09:16:41 crc kubenswrapper[4870]: I0130 09:16:41.279727 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:41 crc kubenswrapper[4870]: I0130 09:16:41.281028 4870 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:42 crc kubenswrapper[4870]: I0130 09:16:42.351058 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dbjtr" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="registry-server" probeResult="failure" output=< Jan 30 09:16:42 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 09:16:42 crc kubenswrapper[4870]: > Jan 30 09:16:46 crc kubenswrapper[4870]: I0130 09:16:46.281323 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jjl7n" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="registry-server" probeResult="failure" output=< Jan 30 09:16:46 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s Jan 30 09:16:46 crc kubenswrapper[4870]: > Jan 30 09:16:51 crc kubenswrapper[4870]: I0130 09:16:51.373015 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:51 crc kubenswrapper[4870]: I0130 09:16:51.440645 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:52 crc kubenswrapper[4870]: I0130 09:16:52.143745 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbjtr"] Jan 30 09:16:52 crc kubenswrapper[4870]: I0130 09:16:52.529288 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dbjtr" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="registry-server" containerID="cri-o://b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9" gracePeriod=2 Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.086598 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.197243 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-catalog-content\") pod \"aad36608-62a8-434a-899e-2383285678ba\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.197339 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-utilities\") pod \"aad36608-62a8-434a-899e-2383285678ba\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.197386 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwkbc\" (UniqueName: \"kubernetes.io/projected/aad36608-62a8-434a-899e-2383285678ba-kube-api-access-cwkbc\") pod \"aad36608-62a8-434a-899e-2383285678ba\" (UID: \"aad36608-62a8-434a-899e-2383285678ba\") " Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.198414 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-utilities" (OuterVolumeSpecName: "utilities") pod "aad36608-62a8-434a-899e-2383285678ba" (UID: "aad36608-62a8-434a-899e-2383285678ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.202389 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad36608-62a8-434a-899e-2383285678ba-kube-api-access-cwkbc" (OuterVolumeSpecName: "kube-api-access-cwkbc") pod "aad36608-62a8-434a-899e-2383285678ba" (UID: "aad36608-62a8-434a-899e-2383285678ba"). InnerVolumeSpecName "kube-api-access-cwkbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.299750 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.299798 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwkbc\" (UniqueName: \"kubernetes.io/projected/aad36608-62a8-434a-899e-2383285678ba-kube-api-access-cwkbc\") on node \"crc\" DevicePath \"\"" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.336896 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aad36608-62a8-434a-899e-2383285678ba" (UID: "aad36608-62a8-434a-899e-2383285678ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.401396 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad36608-62a8-434a-899e-2383285678ba-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.543534 4870 generic.go:334] "Generic (PLEG): container finished" podID="aad36608-62a8-434a-899e-2383285678ba" containerID="b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9" exitCode=0 Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.543602 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerDied","Data":"b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9"} Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.545591 4870 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dbjtr" event={"ID":"aad36608-62a8-434a-899e-2383285678ba","Type":"ContainerDied","Data":"1c7d72833670a470a189ef9ff7a4db35f8fb4ede35202afd047cf7958ef6ba71"} Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.543619 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbjtr" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.545648 4870 scope.go:117] "RemoveContainer" containerID="b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.586709 4870 scope.go:117] "RemoveContainer" containerID="3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.588057 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbjtr"] Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.598595 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dbjtr"] Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.618834 4870 scope.go:117] "RemoveContainer" containerID="52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.664009 4870 scope.go:117] "RemoveContainer" containerID="b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9" Jan 30 09:16:53 crc kubenswrapper[4870]: E0130 09:16:53.664521 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9\": container with ID starting with b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9 not found: ID does not exist" containerID="b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.664566 4870 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9"} err="failed to get container status \"b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9\": rpc error: code = NotFound desc = could not find container \"b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9\": container with ID starting with b2a057e454fb8ea1e6f538c999f8939fb5e124fde99577ab0faebaa82d0442b9 not found: ID does not exist" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.664587 4870 scope.go:117] "RemoveContainer" containerID="3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e" Jan 30 09:16:53 crc kubenswrapper[4870]: E0130 09:16:53.664816 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e\": container with ID starting with 3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e not found: ID does not exist" containerID="3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.664907 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e"} err="failed to get container status \"3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e\": rpc error: code = NotFound desc = could not find container \"3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e\": container with ID starting with 3703c7a0066a6584621728d0695721b30bca7cbc891760dbc17a30dbc0409a7e not found: ID does not exist" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.664992 4870 scope.go:117] "RemoveContainer" containerID="52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b" Jan 30 09:16:53 crc kubenswrapper[4870]: E0130 
09:16:53.665198 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b\": container with ID starting with 52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b not found: ID does not exist" containerID="52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b" Jan 30 09:16:53 crc kubenswrapper[4870]: I0130 09:16:53.665227 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b"} err="failed to get container status \"52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b\": rpc error: code = NotFound desc = could not find container \"52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b\": container with ID starting with 52dd63245a1d7e8beb539c26a9811344da5f713cd40dc5ebd64a530ff9ba888b not found: ID does not exist" Jan 30 09:16:54 crc kubenswrapper[4870]: I0130 09:16:54.094268 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad36608-62a8-434a-899e-2383285678ba" path="/var/lib/kubelet/pods/aad36608-62a8-434a-899e-2383285678ba/volumes" Jan 30 09:16:54 crc kubenswrapper[4870]: I0130 09:16:54.815728 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:54 crc kubenswrapper[4870]: I0130 09:16:54.888054 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjl7n" Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.141317 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjl7n"] Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.142285 4870 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-jjl7n" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="registry-server" containerID="cri-o://9d90852ba42919283ddf463edcc40bc4b7429a7159617021ec5a64efb04dbac5" gracePeriod=2
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.602803 4870 generic.go:334] "Generic (PLEG): container finished" podID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerID="9d90852ba42919283ddf463edcc40bc4b7429a7159617021ec5a64efb04dbac5" exitCode=0
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.603133 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl7n" event={"ID":"d63faf5c-d054-4828-b211-c0f100f1f4ca","Type":"ContainerDied","Data":"9d90852ba42919283ddf463edcc40bc4b7429a7159617021ec5a64efb04dbac5"}
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.603186 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjl7n" event={"ID":"d63faf5c-d054-4828-b211-c0f100f1f4ca","Type":"ContainerDied","Data":"15ef4d9c1175afeaf8942eee6fe6591c948a8b9b5f0d512ace1d005e4aefce89"}
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.603200 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15ef4d9c1175afeaf8942eee6fe6591c948a8b9b5f0d512ace1d005e4aefce89"
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.696294 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjl7n"
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.804859 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-utilities\") pod \"d63faf5c-d054-4828-b211-c0f100f1f4ca\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") "
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.805312 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6k9s\" (UniqueName: \"kubernetes.io/projected/d63faf5c-d054-4828-b211-c0f100f1f4ca-kube-api-access-n6k9s\") pod \"d63faf5c-d054-4828-b211-c0f100f1f4ca\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") "
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.805427 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-catalog-content\") pod \"d63faf5c-d054-4828-b211-c0f100f1f4ca\" (UID: \"d63faf5c-d054-4828-b211-c0f100f1f4ca\") "
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.805984 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-utilities" (OuterVolumeSpecName: "utilities") pod "d63faf5c-d054-4828-b211-c0f100f1f4ca" (UID: "d63faf5c-d054-4828-b211-c0f100f1f4ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.806322 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.812991 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63faf5c-d054-4828-b211-c0f100f1f4ca-kube-api-access-n6k9s" (OuterVolumeSpecName: "kube-api-access-n6k9s") pod "d63faf5c-d054-4828-b211-c0f100f1f4ca" (UID: "d63faf5c-d054-4828-b211-c0f100f1f4ca"). InnerVolumeSpecName "kube-api-access-n6k9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.868626 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d63faf5c-d054-4828-b211-c0f100f1f4ca" (UID: "d63faf5c-d054-4828-b211-c0f100f1f4ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.908470 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6k9s\" (UniqueName: \"kubernetes.io/projected/d63faf5c-d054-4828-b211-c0f100f1f4ca-kube-api-access-n6k9s\") on node \"crc\" DevicePath \"\""
Jan 30 09:16:57 crc kubenswrapper[4870]: I0130 09:16:57.908517 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d63faf5c-d054-4828-b211-c0f100f1f4ca-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:16:58 crc kubenswrapper[4870]: I0130 09:16:58.615272 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjl7n"
Jan 30 09:16:58 crc kubenswrapper[4870]: I0130 09:16:58.647618 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjl7n"]
Jan 30 09:16:58 crc kubenswrapper[4870]: I0130 09:16:58.659900 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jjl7n"]
Jan 30 09:17:00 crc kubenswrapper[4870]: I0130 09:17:00.085387 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" path="/var/lib/kubelet/pods/d63faf5c-d054-4828-b211-c0f100f1f4ca/volumes"
Jan 30 09:17:25 crc kubenswrapper[4870]: I0130 09:17:25.250320 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:17:25 crc kubenswrapper[4870]: I0130 09:17:25.251339 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.127410 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xltn8"]
Jan 30 09:17:43 crc kubenswrapper[4870]: E0130 09:17:43.129684 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="extract-utilities"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.129730 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="extract-utilities"
Jan 30 09:17:43 crc kubenswrapper[4870]: E0130 09:17:43.129774 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="extract-content"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.129800 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="extract-content"
Jan 30 09:17:43 crc kubenswrapper[4870]: E0130 09:17:43.129822 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="extract-utilities"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.129831 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="extract-utilities"
Jan 30 09:17:43 crc kubenswrapper[4870]: E0130 09:17:43.129847 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="extract-content"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.129856 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="extract-content"
Jan 30 09:17:43 crc kubenswrapper[4870]: E0130 09:17:43.129921 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="registry-server"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.129929 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="registry-server"
Jan 30 09:17:43 crc kubenswrapper[4870]: E0130 09:17:43.129939 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="registry-server"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.129948 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="registry-server"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.130215 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad36608-62a8-434a-899e-2383285678ba" containerName="registry-server"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.130236 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63faf5c-d054-4828-b211-c0f100f1f4ca" containerName="registry-server"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.133326 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.147115 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xltn8"]
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.225485 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-utilities\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.225654 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-788fg\" (UniqueName: \"kubernetes.io/projected/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-kube-api-access-788fg\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.226541 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-catalog-content\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.328339 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-utilities\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.328432 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-788fg\" (UniqueName: \"kubernetes.io/projected/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-kube-api-access-788fg\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.328511 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-catalog-content\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.328790 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-utilities\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.328835 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-catalog-content\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.361344 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-788fg\" (UniqueName: \"kubernetes.io/projected/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-kube-api-access-788fg\") pod \"community-operators-xltn8\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") " pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:43 crc kubenswrapper[4870]: I0130 09:17:43.465247 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:44 crc kubenswrapper[4870]: I0130 09:17:44.070266 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xltn8"]
Jan 30 09:17:44 crc kubenswrapper[4870]: I0130 09:17:44.122909 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xltn8" event={"ID":"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6","Type":"ContainerStarted","Data":"540d574ff699baeb20b5d5ca0c0e348d73245fc1b9ba622032b240f6bcf3e045"}
Jan 30 09:17:45 crc kubenswrapper[4870]: I0130 09:17:45.132528 4870 generic.go:334] "Generic (PLEG): container finished" podID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerID="01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646" exitCode=0
Jan 30 09:17:45 crc kubenswrapper[4870]: I0130 09:17:45.132594 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xltn8" event={"ID":"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6","Type":"ContainerDied","Data":"01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646"}
Jan 30 09:17:45 crc kubenswrapper[4870]: I0130 09:17:45.135323 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 09:17:47 crc kubenswrapper[4870]: I0130 09:17:47.162415 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xltn8" event={"ID":"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6","Type":"ContainerStarted","Data":"b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43"}
Jan 30 09:17:48 crc kubenswrapper[4870]: I0130 09:17:48.177775 4870 generic.go:334] "Generic (PLEG): container finished" podID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerID="b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43" exitCode=0
Jan 30 09:17:48 crc kubenswrapper[4870]: I0130 09:17:48.177843 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xltn8" event={"ID":"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6","Type":"ContainerDied","Data":"b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43"}
Jan 30 09:17:49 crc kubenswrapper[4870]: I0130 09:17:49.190614 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xltn8" event={"ID":"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6","Type":"ContainerStarted","Data":"8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067"}
Jan 30 09:17:49 crc kubenswrapper[4870]: I0130 09:17:49.215866 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xltn8" podStartSLOduration=2.757683645 podStartE2EDuration="6.215843675s" podCreationTimestamp="2026-01-30 09:17:43 +0000 UTC" firstStartedPulling="2026-01-30 09:17:45.135115956 +0000 UTC m=+4103.830663065" lastFinishedPulling="2026-01-30 09:17:48.593275986 +0000 UTC m=+4107.288823095" observedRunningTime="2026-01-30 09:17:49.210040994 +0000 UTC m=+4107.905588103" watchObservedRunningTime="2026-01-30 09:17:49.215843675 +0000 UTC m=+4107.911390784"
Jan 30 09:17:53 crc kubenswrapper[4870]: I0130 09:17:53.466179 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:53 crc kubenswrapper[4870]: I0130 09:17:53.466614 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:53 crc kubenswrapper[4870]: I0130 09:17:53.523824 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:54 crc kubenswrapper[4870]: I0130 09:17:54.291424 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:54 crc kubenswrapper[4870]: I0130 09:17:54.357171 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xltn8"]
Jan 30 09:17:55 crc kubenswrapper[4870]: I0130 09:17:55.249909 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:17:55 crc kubenswrapper[4870]: I0130 09:17:55.249970 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:17:56 crc kubenswrapper[4870]: I0130 09:17:56.255323 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xltn8" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="registry-server" containerID="cri-o://8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067" gracePeriod=2
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.058144 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.102276 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-catalog-content\") pod \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") "
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.102533 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-788fg\" (UniqueName: \"kubernetes.io/projected/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-kube-api-access-788fg\") pod \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") "
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.102658 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-utilities\") pod \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\" (UID: \"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6\") "
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.104607 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-utilities" (OuterVolumeSpecName: "utilities") pod "946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" (UID: "946c4108-ce02-48eb-b8de-4ac9d6b4e6e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.116817 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-kube-api-access-788fg" (OuterVolumeSpecName: "kube-api-access-788fg") pod "946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" (UID: "946c4108-ce02-48eb-b8de-4ac9d6b4e6e6"). InnerVolumeSpecName "kube-api-access-788fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.158530 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" (UID: "946c4108-ce02-48eb-b8de-4ac9d6b4e6e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.205173 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.205210 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-788fg\" (UniqueName: \"kubernetes.io/projected/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-kube-api-access-788fg\") on node \"crc\" DevicePath \"\""
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.205224 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.293847 4870 generic.go:334] "Generic (PLEG): container finished" podID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerID="8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067" exitCode=0
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.293921 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xltn8" event={"ID":"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6","Type":"ContainerDied","Data":"8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067"}
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.293959 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xltn8" event={"ID":"946c4108-ce02-48eb-b8de-4ac9d6b4e6e6","Type":"ContainerDied","Data":"540d574ff699baeb20b5d5ca0c0e348d73245fc1b9ba622032b240f6bcf3e045"}
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.293978 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xltn8"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.293985 4870 scope.go:117] "RemoveContainer" containerID="8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.334379 4870 scope.go:117] "RemoveContainer" containerID="b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.356712 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xltn8"]
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.367781 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xltn8"]
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.374174 4870 scope.go:117] "RemoveContainer" containerID="01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.446736 4870 scope.go:117] "RemoveContainer" containerID="8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067"
Jan 30 09:17:57 crc kubenswrapper[4870]: E0130 09:17:57.451027 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067\": container with ID starting with 8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067 not found: ID does not exist" containerID="8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.451067 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067"} err="failed to get container status \"8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067\": rpc error: code = NotFound desc = could not find container \"8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067\": container with ID starting with 8a2e5018eed8f0b1d1186c606d6a4aaf2a29e90c9c34afae09d54ab37b3a0067 not found: ID does not exist"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.451097 4870 scope.go:117] "RemoveContainer" containerID="b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43"
Jan 30 09:17:57 crc kubenswrapper[4870]: E0130 09:17:57.451651 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43\": container with ID starting with b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43 not found: ID does not exist" containerID="b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.451692 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43"} err="failed to get container status \"b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43\": rpc error: code = NotFound desc = could not find container \"b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43\": container with ID starting with b645803acf590699cfe77d3b6594c393ad712f63f991d278bb28576169b9fe43 not found: ID does not exist"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.451711 4870 scope.go:117] "RemoveContainer" containerID="01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646"
Jan 30 09:17:57 crc kubenswrapper[4870]: E0130 09:17:57.453119 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646\": container with ID starting with 01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646 not found: ID does not exist" containerID="01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646"
Jan 30 09:17:57 crc kubenswrapper[4870]: I0130 09:17:57.453176 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646"} err="failed to get container status \"01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646\": rpc error: code = NotFound desc = could not find container \"01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646\": container with ID starting with 01ea8007b3bb0cbc4e47aa121bb382d34da19f0d27e1f3c77abeddb5f819b646 not found: ID does not exist"
Jan 30 09:17:58 crc kubenswrapper[4870]: I0130 09:17:58.089197 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" path="/var/lib/kubelet/pods/946c4108-ce02-48eb-b8de-4ac9d6b4e6e6/volumes"
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.249545 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.250137 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.250190 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8"
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.251113 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78ce0353b7c3323cf1e9811378a7fe1749f61d99675d2c6c610bade4f28a064b"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.251164 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://78ce0353b7c3323cf1e9811378a7fe1749f61d99675d2c6c610bade4f28a064b" gracePeriod=600
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.615900 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="78ce0353b7c3323cf1e9811378a7fe1749f61d99675d2c6c610bade4f28a064b" exitCode=0
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.616014 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"78ce0353b7c3323cf1e9811378a7fe1749f61d99675d2c6c610bade4f28a064b"}
Jan 30 09:18:25 crc kubenswrapper[4870]: I0130 09:18:25.616537 4870 scope.go:117] "RemoveContainer" containerID="b46440b22ae9d2fa04397a53f0f4fc091b7e2f214fbae6daba34107e15d81271"
Jan 30 09:18:26 crc kubenswrapper[4870]: I0130 09:18:26.631082 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"}
Jan 30 09:20:25 crc kubenswrapper[4870]: I0130 09:20:25.249579 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:20:25 crc kubenswrapper[4870]: I0130 09:20:25.250542 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:20:55 crc kubenswrapper[4870]: I0130 09:20:55.250093 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:20:55 crc kubenswrapper[4870]: I0130 09:20:55.250972 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.250187 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.251036 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.251121 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8"
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.252343 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.252437 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" gracePeriod=600
Jan 30 09:21:25 crc kubenswrapper[4870]: E0130 09:21:25.384276 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.631759 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" exitCode=0
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.631805 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"}
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.631842 4870 scope.go:117] "RemoveContainer" containerID="78ce0353b7c3323cf1e9811378a7fe1749f61d99675d2c6c610bade4f28a064b"
Jan 30 09:21:25 crc kubenswrapper[4870]: I0130 09:21:25.632537 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:21:25 crc kubenswrapper[4870]: E0130 09:21:25.632916 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:21:40 crc kubenswrapper[4870]: I0130 09:21:40.075554 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:21:40 crc kubenswrapper[4870]: E0130 09:21:40.076623 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:21:51 crc kubenswrapper[4870]: I0130 09:21:51.075531 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:21:51 crc kubenswrapper[4870]: E0130 09:21:51.076950 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:22:04 crc kubenswrapper[4870]: I0130 09:22:04.074985 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:22:04 crc kubenswrapper[4870]: E0130 09:22:04.075819 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.597507 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvdlz"]
Jan 30 09:22:07 crc kubenswrapper[4870]: E0130 09:22:07.598762 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="extract-content"
Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.598783 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="extract-content"
Jan 30 09:22:07 crc kubenswrapper[4870]: E0130 09:22:07.598808 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="registry-server"
Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.598822 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="registry-server"
Jan 30 09:22:07 crc kubenswrapper[4870]: E0130 09:22:07.598864 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="extract-utilities"
Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.598898 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="extract-utilities"
Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.599281 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="946c4108-ce02-48eb-b8de-4ac9d6b4e6e6" containerName="registry-server"
Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.601905 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.610864 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvdlz"] Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.632370 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2cbc\" (UniqueName: \"kubernetes.io/projected/557c0167-03e1-4176-89dc-88cbef924f2d-kube-api-access-b2cbc\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.632433 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-utilities\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.632540 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-catalog-content\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.735058 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2cbc\" (UniqueName: \"kubernetes.io/projected/557c0167-03e1-4176-89dc-88cbef924f2d-kube-api-access-b2cbc\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.735135 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-utilities\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.735240 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-catalog-content\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.735928 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-catalog-content\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.736507 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-utilities\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.768027 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2cbc\" (UniqueName: \"kubernetes.io/projected/557c0167-03e1-4176-89dc-88cbef924f2d-kube-api-access-b2cbc\") pod \"redhat-marketplace-pvdlz\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:07 crc kubenswrapper[4870]: I0130 09:22:07.951465 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:08 crc kubenswrapper[4870]: I0130 09:22:08.458496 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvdlz"] Jan 30 09:22:08 crc kubenswrapper[4870]: W0130 09:22:08.462108 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557c0167_03e1_4176_89dc_88cbef924f2d.slice/crio-0aa9bcaea6ba301b86069968b571da51ccce8795756e555aa4d98a6aa66c8021 WatchSource:0}: Error finding container 0aa9bcaea6ba301b86069968b571da51ccce8795756e555aa4d98a6aa66c8021: Status 404 returned error can't find the container with id 0aa9bcaea6ba301b86069968b571da51ccce8795756e555aa4d98a6aa66c8021 Jan 30 09:22:09 crc kubenswrapper[4870]: I0130 09:22:09.124635 4870 generic.go:334] "Generic (PLEG): container finished" podID="557c0167-03e1-4176-89dc-88cbef924f2d" containerID="85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67" exitCode=0 Jan 30 09:22:09 crc kubenswrapper[4870]: I0130 09:22:09.125024 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvdlz" event={"ID":"557c0167-03e1-4176-89dc-88cbef924f2d","Type":"ContainerDied","Data":"85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67"} Jan 30 09:22:09 crc kubenswrapper[4870]: I0130 09:22:09.125056 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvdlz" event={"ID":"557c0167-03e1-4176-89dc-88cbef924f2d","Type":"ContainerStarted","Data":"0aa9bcaea6ba301b86069968b571da51ccce8795756e555aa4d98a6aa66c8021"} Jan 30 09:22:10 crc kubenswrapper[4870]: I0130 09:22:10.135853 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvdlz" 
event={"ID":"557c0167-03e1-4176-89dc-88cbef924f2d","Type":"ContainerStarted","Data":"2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc"} Jan 30 09:22:11 crc kubenswrapper[4870]: I0130 09:22:11.147542 4870 generic.go:334] "Generic (PLEG): container finished" podID="557c0167-03e1-4176-89dc-88cbef924f2d" containerID="2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc" exitCode=0 Jan 30 09:22:11 crc kubenswrapper[4870]: I0130 09:22:11.147762 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvdlz" event={"ID":"557c0167-03e1-4176-89dc-88cbef924f2d","Type":"ContainerDied","Data":"2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc"} Jan 30 09:22:12 crc kubenswrapper[4870]: I0130 09:22:12.160237 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvdlz" event={"ID":"557c0167-03e1-4176-89dc-88cbef924f2d","Type":"ContainerStarted","Data":"1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6"} Jan 30 09:22:12 crc kubenswrapper[4870]: I0130 09:22:12.193413 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvdlz" podStartSLOduration=2.754915098 podStartE2EDuration="5.193391707s" podCreationTimestamp="2026-01-30 09:22:07 +0000 UTC" firstStartedPulling="2026-01-30 09:22:09.128734253 +0000 UTC m=+4367.824281362" lastFinishedPulling="2026-01-30 09:22:11.567210852 +0000 UTC m=+4370.262757971" observedRunningTime="2026-01-30 09:22:12.180821514 +0000 UTC m=+4370.876368623" watchObservedRunningTime="2026-01-30 09:22:12.193391707 +0000 UTC m=+4370.888938816" Jan 30 09:22:15 crc kubenswrapper[4870]: I0130 09:22:15.074623 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:22:15 crc kubenswrapper[4870]: E0130 09:22:15.075411 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:22:17 crc kubenswrapper[4870]: I0130 09:22:17.952249 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:17 crc kubenswrapper[4870]: I0130 09:22:17.952909 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:17 crc kubenswrapper[4870]: I0130 09:22:17.996782 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:18 crc kubenswrapper[4870]: I0130 09:22:18.269066 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:18 crc kubenswrapper[4870]: I0130 09:22:18.323384 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvdlz"] Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.234006 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pvdlz" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="registry-server" containerID="cri-o://1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6" gracePeriod=2 Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.765739 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.869214 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2cbc\" (UniqueName: \"kubernetes.io/projected/557c0167-03e1-4176-89dc-88cbef924f2d-kube-api-access-b2cbc\") pod \"557c0167-03e1-4176-89dc-88cbef924f2d\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.869371 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-utilities\") pod \"557c0167-03e1-4176-89dc-88cbef924f2d\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.869444 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-catalog-content\") pod \"557c0167-03e1-4176-89dc-88cbef924f2d\" (UID: \"557c0167-03e1-4176-89dc-88cbef924f2d\") " Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.870387 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-utilities" (OuterVolumeSpecName: "utilities") pod "557c0167-03e1-4176-89dc-88cbef924f2d" (UID: "557c0167-03e1-4176-89dc-88cbef924f2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.876225 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557c0167-03e1-4176-89dc-88cbef924f2d-kube-api-access-b2cbc" (OuterVolumeSpecName: "kube-api-access-b2cbc") pod "557c0167-03e1-4176-89dc-88cbef924f2d" (UID: "557c0167-03e1-4176-89dc-88cbef924f2d"). InnerVolumeSpecName "kube-api-access-b2cbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.893502 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "557c0167-03e1-4176-89dc-88cbef924f2d" (UID: "557c0167-03e1-4176-89dc-88cbef924f2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.973101 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.973155 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2cbc\" (UniqueName: \"kubernetes.io/projected/557c0167-03e1-4176-89dc-88cbef924f2d-kube-api-access-b2cbc\") on node \"crc\" DevicePath \"\"" Jan 30 09:22:20 crc kubenswrapper[4870]: I0130 09:22:20.973178 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557c0167-03e1-4176-89dc-88cbef924f2d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.253139 4870 generic.go:334] "Generic (PLEG): container finished" podID="557c0167-03e1-4176-89dc-88cbef924f2d" containerID="1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6" exitCode=0 Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.253204 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvdlz" event={"ID":"557c0167-03e1-4176-89dc-88cbef924f2d","Type":"ContainerDied","Data":"1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6"} Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.253243 4870 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pvdlz" event={"ID":"557c0167-03e1-4176-89dc-88cbef924f2d","Type":"ContainerDied","Data":"0aa9bcaea6ba301b86069968b571da51ccce8795756e555aa4d98a6aa66c8021"} Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.253265 4870 scope.go:117] "RemoveContainer" containerID="1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.253448 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvdlz" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.294613 4870 scope.go:117] "RemoveContainer" containerID="2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.314933 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvdlz"] Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.316734 4870 scope.go:117] "RemoveContainer" containerID="85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.327319 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvdlz"] Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.372833 4870 scope.go:117] "RemoveContainer" containerID="1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6" Jan 30 09:22:21 crc kubenswrapper[4870]: E0130 09:22:21.373492 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6\": container with ID starting with 1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6 not found: ID does not exist" containerID="1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.373524 4870 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6"} err="failed to get container status \"1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6\": rpc error: code = NotFound desc = could not find container \"1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6\": container with ID starting with 1e69adb24152ee0cb8f50d991a0404e146f90566276c56f5dafb33d4ffb7ffa6 not found: ID does not exist" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.373565 4870 scope.go:117] "RemoveContainer" containerID="2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc" Jan 30 09:22:21 crc kubenswrapper[4870]: E0130 09:22:21.373958 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc\": container with ID starting with 2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc not found: ID does not exist" containerID="2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.374004 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc"} err="failed to get container status \"2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc\": rpc error: code = NotFound desc = could not find container \"2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc\": container with ID starting with 2c0a1e1837f6a0f60772e976ef229a93da2a31800f3a31dcc7e73fd386908bfc not found: ID does not exist" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.374034 4870 scope.go:117] "RemoveContainer" containerID="85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67" Jan 30 09:22:21 crc kubenswrapper[4870]: E0130 
09:22:21.374354 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67\": container with ID starting with 85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67 not found: ID does not exist" containerID="85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67" Jan 30 09:22:21 crc kubenswrapper[4870]: I0130 09:22:21.374394 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67"} err="failed to get container status \"85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67\": rpc error: code = NotFound desc = could not find container \"85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67\": container with ID starting with 85d75aa3df5ba51083766d98dab3c127cb48ae1d8b08b9050b15950b6ccc4c67 not found: ID does not exist" Jan 30 09:22:22 crc kubenswrapper[4870]: I0130 09:22:22.093928 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" path="/var/lib/kubelet/pods/557c0167-03e1-4176-89dc-88cbef924f2d/volumes" Jan 30 09:22:27 crc kubenswrapper[4870]: I0130 09:22:27.074973 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:22:27 crc kubenswrapper[4870]: E0130 09:22:27.075632 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:22:38 crc kubenswrapper[4870]: I0130 09:22:38.075012 
4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:22:38 crc kubenswrapper[4870]: E0130 09:22:38.075612 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:22:38 crc kubenswrapper[4870]: I0130 09:22:38.962934 4870 scope.go:117] "RemoveContainer" containerID="03a4e150a1883a14201e377adab34fa66160798fa54ff5e83c49cff343c943f0" Jan 30 09:22:39 crc kubenswrapper[4870]: I0130 09:22:39.004308 4870 scope.go:117] "RemoveContainer" containerID="9d90852ba42919283ddf463edcc40bc4b7429a7159617021ec5a64efb04dbac5" Jan 30 09:22:39 crc kubenswrapper[4870]: I0130 09:22:39.047923 4870 scope.go:117] "RemoveContainer" containerID="9ce88b36b7f8bb4c84d2e0cb990ff2dfc0b7ef3e5a70fa171000086079fff96f" Jan 30 09:22:49 crc kubenswrapper[4870]: I0130 09:22:49.075412 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:22:49 crc kubenswrapper[4870]: E0130 09:22:49.076258 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:23:02 crc kubenswrapper[4870]: I0130 09:23:02.089238 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" 
Jan 30 09:23:02 crc kubenswrapper[4870]: E0130 09:23:02.090239 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:23:15 crc kubenswrapper[4870]: I0130 09:23:15.074528 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:23:15 crc kubenswrapper[4870]: E0130 09:23:15.075473 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:23:26 crc kubenswrapper[4870]: I0130 09:23:26.075349 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:23:26 crc kubenswrapper[4870]: E0130 09:23:26.076188 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:23:38 crc kubenswrapper[4870]: I0130 09:23:38.075539 4870 scope.go:117] "RemoveContainer" 
containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:23:38 crc kubenswrapper[4870]: E0130 09:23:38.076433 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:23:50 crc kubenswrapper[4870]: I0130 09:23:50.075495 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:23:50 crc kubenswrapper[4870]: E0130 09:23:50.076370 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:24:05 crc kubenswrapper[4870]: I0130 09:24:05.076041 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:24:05 crc kubenswrapper[4870]: E0130 09:24:05.076831 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:24:17 crc kubenswrapper[4870]: I0130 09:24:17.075335 4870 scope.go:117] 
"RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:24:17 crc kubenswrapper[4870]: E0130 09:24:17.076274 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:24:32 crc kubenswrapper[4870]: I0130 09:24:32.082102 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:24:32 crc kubenswrapper[4870]: E0130 09:24:32.083040 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:24:46 crc kubenswrapper[4870]: I0130 09:24:46.074564 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:24:46 crc kubenswrapper[4870]: E0130 09:24:46.075635 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:24:57 crc kubenswrapper[4870]: I0130 09:24:57.076236 
4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:24:57 crc kubenswrapper[4870]: E0130 09:24:57.077589 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:25:08 crc kubenswrapper[4870]: I0130 09:25:08.074974 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:25:08 crc kubenswrapper[4870]: E0130 09:25:08.075764 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:25:22 crc kubenswrapper[4870]: I0130 09:25:22.082183 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:25:22 crc kubenswrapper[4870]: E0130 09:25:22.082917 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:25:34 crc kubenswrapper[4870]: I0130 09:25:34.075085 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:25:34 crc kubenswrapper[4870]: E0130 09:25:34.075806 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:25:46 crc kubenswrapper[4870]: I0130 09:25:46.074648 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:25:46 crc kubenswrapper[4870]: E0130 09:25:46.075527 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:26:01 crc kubenswrapper[4870]: I0130 09:26:01.075146 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:26:01 crc kubenswrapper[4870]: E0130 09:26:01.075993 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:26:14 crc kubenswrapper[4870]: I0130 09:26:14.074670 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:26:14 crc kubenswrapper[4870]: E0130 09:26:14.075423 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:26:29 crc kubenswrapper[4870]: I0130 09:26:29.075226 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27"
Jan 30 09:26:29 crc kubenswrapper[4870]: I0130 09:26:29.776942 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"795d2451707be8d2499e0a626a854aa75de48e7e4bf87e79653cebe4102927af"}
Jan 30 09:26:30 crc kubenswrapper[4870]: I0130 09:26:30.903575 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrtns"]
Jan 30 09:26:30 crc kubenswrapper[4870]: E0130 09:26:30.904569 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="extract-utilities"
Jan 30 09:26:30 crc kubenswrapper[4870]: I0130 09:26:30.904585 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="extract-utilities"
Jan 30 09:26:30 crc kubenswrapper[4870]: E0130 09:26:30.904623 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="extract-content"
Jan 30 09:26:30 crc kubenswrapper[4870]: I0130 09:26:30.904631 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="extract-content"
Jan 30 09:26:30 crc kubenswrapper[4870]: E0130 09:26:30.904647 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="registry-server"
Jan 30 09:26:30 crc kubenswrapper[4870]: I0130 09:26:30.904654 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="registry-server"
Jan 30 09:26:30 crc kubenswrapper[4870]: I0130 09:26:30.904859 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="557c0167-03e1-4176-89dc-88cbef924f2d" containerName="registry-server"
Jan 30 09:26:30 crc kubenswrapper[4870]: I0130 09:26:30.906319 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:30 crc kubenswrapper[4870]: I0130 09:26:30.917340 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrtns"]
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.018808 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-utilities\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.019017 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-catalog-content\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.019351 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88wm\" (UniqueName: \"kubernetes.io/projected/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-kube-api-access-f88wm\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.120757 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-utilities\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.120849 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-catalog-content\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.120991 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f88wm\" (UniqueName: \"kubernetes.io/projected/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-kube-api-access-f88wm\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.121268 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-utilities\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.121507 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-catalog-content\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.148973 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88wm\" (UniqueName: \"kubernetes.io/projected/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-kube-api-access-f88wm\") pod \"redhat-operators-qrtns\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") " pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.225896 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.744413 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrtns"]
Jan 30 09:26:31 crc kubenswrapper[4870]: I0130 09:26:31.798380 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrtns" event={"ID":"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d","Type":"ContainerStarted","Data":"0660a37d9a474051d2be6f69ed9530a611f3c39fa1b6766160ab0c3404bd0861"}
Jan 30 09:26:32 crc kubenswrapper[4870]: I0130 09:26:32.809089 4870 generic.go:334] "Generic (PLEG): container finished" podID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerID="08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2" exitCode=0
Jan 30 09:26:32 crc kubenswrapper[4870]: I0130 09:26:32.809534 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrtns" event={"ID":"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d","Type":"ContainerDied","Data":"08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2"}
Jan 30 09:26:32 crc kubenswrapper[4870]: I0130 09:26:32.811949 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 09:26:33 crc kubenswrapper[4870]: I0130 09:26:33.820140 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrtns" event={"ID":"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d","Type":"ContainerStarted","Data":"25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f"}
Jan 30 09:26:37 crc kubenswrapper[4870]: I0130 09:26:37.862255 4870 generic.go:334] "Generic (PLEG): container finished" podID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerID="25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f" exitCode=0
Jan 30 09:26:37 crc kubenswrapper[4870]: I0130 09:26:37.862345 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrtns" event={"ID":"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d","Type":"ContainerDied","Data":"25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f"}
Jan 30 09:26:38 crc kubenswrapper[4870]: I0130 09:26:38.874564 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrtns" event={"ID":"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d","Type":"ContainerStarted","Data":"10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd"}
Jan 30 09:26:38 crc kubenswrapper[4870]: I0130 09:26:38.908252 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrtns" podStartSLOduration=3.466153904 podStartE2EDuration="8.908230892s" podCreationTimestamp="2026-01-30 09:26:30 +0000 UTC" firstStartedPulling="2026-01-30 09:26:32.811730439 +0000 UTC m=+4631.507277548" lastFinishedPulling="2026-01-30 09:26:38.253807427 +0000 UTC m=+4636.949354536" observedRunningTime="2026-01-30 09:26:38.895287019 +0000 UTC m=+4637.590834158" watchObservedRunningTime="2026-01-30 09:26:38.908230892 +0000 UTC m=+4637.603778011"
Jan 30 09:26:41 crc kubenswrapper[4870]: I0130 09:26:41.226905 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:41 crc kubenswrapper[4870]: I0130 09:26:41.227202 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:42 crc kubenswrapper[4870]: I0130 09:26:42.293172 4870 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qrtns" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="registry-server" probeResult="failure" output=<
Jan 30 09:26:42 crc kubenswrapper[4870]: timeout: failed to connect service ":50051" within 1s
Jan 30 09:26:42 crc kubenswrapper[4870]: >
Jan 30 09:26:51 crc kubenswrapper[4870]: I0130 09:26:51.275359 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:51 crc kubenswrapper[4870]: I0130 09:26:51.345855 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:51 crc kubenswrapper[4870]: I0130 09:26:51.530483 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrtns"]
Jan 30 09:26:52 crc kubenswrapper[4870]: I0130 09:26:52.998467 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrtns" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="registry-server" containerID="cri-o://10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd" gracePeriod=2
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.502165 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.624780 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-utilities\") pod \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") "
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.624943 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-catalog-content\") pod \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") "
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.625229 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f88wm\" (UniqueName: \"kubernetes.io/projected/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-kube-api-access-f88wm\") pod \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\" (UID: \"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d\") "
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.625646 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-utilities" (OuterVolumeSpecName: "utilities") pod "4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" (UID: "4d55e5ff-73ca-40fa-9bbc-f032d7195b6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.626026 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.630808 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-kube-api-access-f88wm" (OuterVolumeSpecName: "kube-api-access-f88wm") pod "4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" (UID: "4d55e5ff-73ca-40fa-9bbc-f032d7195b6d"). InnerVolumeSpecName "kube-api-access-f88wm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.727696 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f88wm\" (UniqueName: \"kubernetes.io/projected/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-kube-api-access-f88wm\") on node \"crc\" DevicePath \"\""
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.756437 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" (UID: "4d55e5ff-73ca-40fa-9bbc-f032d7195b6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:26:53 crc kubenswrapper[4870]: I0130 09:26:53.830010 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.008827 4870 generic.go:334] "Generic (PLEG): container finished" podID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerID="10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd" exitCode=0
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.008892 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrtns" event={"ID":"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d","Type":"ContainerDied","Data":"10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd"}
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.008936 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrtns" event={"ID":"4d55e5ff-73ca-40fa-9bbc-f032d7195b6d","Type":"ContainerDied","Data":"0660a37d9a474051d2be6f69ed9530a611f3c39fa1b6766160ab0c3404bd0861"}
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.008961 4870 scope.go:117] "RemoveContainer" containerID="10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.008983 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrtns"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.039918 4870 scope.go:117] "RemoveContainer" containerID="25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.057920 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrtns"]
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.069073 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrtns"]
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.072269 4870 scope.go:117] "RemoveContainer" containerID="08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.089441 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" path="/var/lib/kubelet/pods/4d55e5ff-73ca-40fa-9bbc-f032d7195b6d/volumes"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.114868 4870 scope.go:117] "RemoveContainer" containerID="10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd"
Jan 30 09:26:54 crc kubenswrapper[4870]: E0130 09:26:54.115463 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd\": container with ID starting with 10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd not found: ID does not exist" containerID="10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.115513 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd"} err="failed to get container status \"10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd\": rpc error: code = NotFound desc = could not find container \"10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd\": container with ID starting with 10f1aa6ed3a026db9825b6e2ad444251698959010b5269ca5d7e71c3c67a68cd not found: ID does not exist"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.115544 4870 scope.go:117] "RemoveContainer" containerID="25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f"
Jan 30 09:26:54 crc kubenswrapper[4870]: E0130 09:26:54.116016 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f\": container with ID starting with 25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f not found: ID does not exist" containerID="25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.116070 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f"} err="failed to get container status \"25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f\": rpc error: code = NotFound desc = could not find container \"25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f\": container with ID starting with 25425c03452a27224c3adc16dcb78d6dbea2801599a50fce22f2fc38ab82723f not found: ID does not exist"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.116088 4870 scope.go:117] "RemoveContainer" containerID="08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2"
Jan 30 09:26:54 crc kubenswrapper[4870]: E0130 09:26:54.116501 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2\": container with ID starting with 08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2 not found: ID does not exist" containerID="08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2"
Jan 30 09:26:54 crc kubenswrapper[4870]: I0130 09:26:54.116529 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2"} err="failed to get container status \"08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2\": rpc error: code = NotFound desc = could not find container \"08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2\": container with ID starting with 08325f1a246b7fc3549417df49a465c23ef614d6d2f42b2a262b961d03a3c1b2 not found: ID does not exist"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.194377 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8l4jx"]
Jan 30 09:27:05 crc kubenswrapper[4870]: E0130 09:27:05.195492 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="extract-content"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.195511 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="extract-content"
Jan 30 09:27:05 crc kubenswrapper[4870]: E0130 09:27:05.195529 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="extract-utilities"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.195536 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="extract-utilities"
Jan 30 09:27:05 crc kubenswrapper[4870]: E0130 09:27:05.195546 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="registry-server"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.195553 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="registry-server"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.195829 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d55e5ff-73ca-40fa-9bbc-f032d7195b6d" containerName="registry-server"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.197649 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.228019 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8l4jx"]
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.400563 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-catalog-content\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.400641 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-utilities\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.400671 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9gkq\" (UniqueName: \"kubernetes.io/projected/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-kube-api-access-b9gkq\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.502756 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-catalog-content\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.502889 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-utilities\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.503086 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9gkq\" (UniqueName: \"kubernetes.io/projected/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-kube-api-access-b9gkq\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.503556 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-utilities\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.503637 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-catalog-content\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.526784 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9gkq\" (UniqueName: \"kubernetes.io/projected/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-kube-api-access-b9gkq\") pod \"certified-operators-8l4jx\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") " pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:05 crc kubenswrapper[4870]: I0130 09:27:05.822503 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:06 crc kubenswrapper[4870]: I0130 09:27:06.353695 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8l4jx"]
Jan 30 09:27:07 crc kubenswrapper[4870]: I0130 09:27:07.131037 4870 generic.go:334] "Generic (PLEG): container finished" podID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerID="983be6322e2103d41550f8f7f6e0e561cbc3dbe9269392d7c5c2526c4f9b5d63" exitCode=0
Jan 30 09:27:07 crc kubenswrapper[4870]: I0130 09:27:07.131120 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8l4jx" event={"ID":"247e00dc-e547-4b0c-802d-7e7ef8dd8b58","Type":"ContainerDied","Data":"983be6322e2103d41550f8f7f6e0e561cbc3dbe9269392d7c5c2526c4f9b5d63"}
Jan 30 09:27:07 crc kubenswrapper[4870]: I0130 09:27:07.131356 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8l4jx" event={"ID":"247e00dc-e547-4b0c-802d-7e7ef8dd8b58","Type":"ContainerStarted","Data":"d79530e1301adb8383a69357cf81b26a5838467cb60dbf7859d6ecbdfa5dea14"}
Jan 30 09:27:08 crc kubenswrapper[4870]: I0130 09:27:08.142451 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8l4jx" event={"ID":"247e00dc-e547-4b0c-802d-7e7ef8dd8b58","Type":"ContainerStarted","Data":"644b76bd8bb66c6ebaa0efcd433e78ebb9485645651bbb2b9c8e69d9cca2a794"}
Jan 30 09:27:10 crc kubenswrapper[4870]: I0130 09:27:10.160699 4870 generic.go:334] "Generic (PLEG): container finished" podID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerID="644b76bd8bb66c6ebaa0efcd433e78ebb9485645651bbb2b9c8e69d9cca2a794" exitCode=0
Jan 30 09:27:10 crc kubenswrapper[4870]: I0130 09:27:10.161043 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8l4jx" event={"ID":"247e00dc-e547-4b0c-802d-7e7ef8dd8b58","Type":"ContainerDied","Data":"644b76bd8bb66c6ebaa0efcd433e78ebb9485645651bbb2b9c8e69d9cca2a794"}
Jan 30 09:27:11 crc kubenswrapper[4870]: I0130 09:27:11.171424 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8l4jx" event={"ID":"247e00dc-e547-4b0c-802d-7e7ef8dd8b58","Type":"ContainerStarted","Data":"59ad2bf5c789adc8979794468c11cf52aedf813fac54e74a61ef08e9e9826d50"}
Jan 30 09:27:11 crc kubenswrapper[4870]: I0130 09:27:11.193452 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8l4jx" podStartSLOduration=2.7156367340000003 podStartE2EDuration="6.193435496s" podCreationTimestamp="2026-01-30 09:27:05 +0000 UTC" firstStartedPulling="2026-01-30 09:27:07.133190114 +0000 UTC m=+4665.828737223" lastFinishedPulling="2026-01-30 09:27:10.610988876 +0000 UTC m=+4669.306535985" observedRunningTime="2026-01-30 09:27:11.190952609 +0000 UTC m=+4669.886499728" watchObservedRunningTime="2026-01-30 09:27:11.193435496 +0000 UTC m=+4669.888982605"
Jan 30 09:27:15 crc kubenswrapper[4870]: I0130 09:27:15.823632 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:15 crc kubenswrapper[4870]: I0130 09:27:15.824229 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:15 crc kubenswrapper[4870]: I0130 09:27:15.877697 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:16 crc kubenswrapper[4870]: I0130 09:27:16.265477 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:16 crc kubenswrapper[4870]: I0130 09:27:16.317665 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8l4jx"]
Jan 30 09:27:18 crc kubenswrapper[4870]: I0130 09:27:18.240798 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8l4jx" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="registry-server" containerID="cri-o://59ad2bf5c789adc8979794468c11cf52aedf813fac54e74a61ef08e9e9826d50" gracePeriod=2
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.251670 4870 generic.go:334] "Generic (PLEG): container finished" podID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerID="59ad2bf5c789adc8979794468c11cf52aedf813fac54e74a61ef08e9e9826d50" exitCode=0
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.251740 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8l4jx" event={"ID":"247e00dc-e547-4b0c-802d-7e7ef8dd8b58","Type":"ContainerDied","Data":"59ad2bf5c789adc8979794468c11cf52aedf813fac54e74a61ef08e9e9826d50"}
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.604347 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8l4jx"
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.632462 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-catalog-content\") pod \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") "
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.632624 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-utilities\") pod \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") "
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.632694 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9gkq\" (UniqueName: \"kubernetes.io/projected/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-kube-api-access-b9gkq\") pod \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\" (UID: \"247e00dc-e547-4b0c-802d-7e7ef8dd8b58\") "
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.633689 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-utilities" (OuterVolumeSpecName: "utilities") pod "247e00dc-e547-4b0c-802d-7e7ef8dd8b58" (UID: "247e00dc-e547-4b0c-802d-7e7ef8dd8b58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.647293 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-kube-api-access-b9gkq" (OuterVolumeSpecName: "kube-api-access-b9gkq") pod "247e00dc-e547-4b0c-802d-7e7ef8dd8b58" (UID: "247e00dc-e547-4b0c-802d-7e7ef8dd8b58"). InnerVolumeSpecName "kube-api-access-b9gkq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.699920 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "247e00dc-e547-4b0c-802d-7e7ef8dd8b58" (UID: "247e00dc-e547-4b0c-802d-7e7ef8dd8b58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.734846 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.734894 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9gkq\" (UniqueName: \"kubernetes.io/projected/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-kube-api-access-b9gkq\") on node \"crc\" DevicePath \"\""
Jan 30 09:27:19 crc kubenswrapper[4870]: I0130 09:27:19.734906 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247e00dc-e547-4b0c-802d-7e7ef8dd8b58-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:27:20 crc kubenswrapper[4870]: I0130 09:27:20.263950 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8l4jx" event={"ID":"247e00dc-e547-4b0c-802d-7e7ef8dd8b58","Type":"ContainerDied","Data":"d79530e1301adb8383a69357cf81b26a5838467cb60dbf7859d6ecbdfa5dea14"}
Jan 30 09:27:20 crc kubenswrapper[4870]: I0130 09:27:20.264031 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8l4jx" Jan 30 09:27:20 crc kubenswrapper[4870]: I0130 09:27:20.264339 4870 scope.go:117] "RemoveContainer" containerID="59ad2bf5c789adc8979794468c11cf52aedf813fac54e74a61ef08e9e9826d50" Jan 30 09:27:20 crc kubenswrapper[4870]: I0130 09:27:20.293254 4870 scope.go:117] "RemoveContainer" containerID="644b76bd8bb66c6ebaa0efcd433e78ebb9485645651bbb2b9c8e69d9cca2a794" Jan 30 09:27:20 crc kubenswrapper[4870]: I0130 09:27:20.294160 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8l4jx"] Jan 30 09:27:20 crc kubenswrapper[4870]: I0130 09:27:20.321940 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8l4jx"] Jan 30 09:27:20 crc kubenswrapper[4870]: I0130 09:27:20.323678 4870 scope.go:117] "RemoveContainer" containerID="983be6322e2103d41550f8f7f6e0e561cbc3dbe9269392d7c5c2526c4f9b5d63" Jan 30 09:27:22 crc kubenswrapper[4870]: I0130 09:27:22.087774 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" path="/var/lib/kubelet/pods/247e00dc-e547-4b0c-802d-7e7ef8dd8b58/volumes" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.419301 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-btswt"] Jan 30 09:28:02 crc kubenswrapper[4870]: E0130 09:28:02.420323 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="registry-server" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.420340 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="registry-server" Jan 30 09:28:02 crc kubenswrapper[4870]: E0130 09:28:02.420355 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="extract-content" Jan 30 09:28:02 
crc kubenswrapper[4870]: I0130 09:28:02.420363 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="extract-content" Jan 30 09:28:02 crc kubenswrapper[4870]: E0130 09:28:02.420389 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="extract-utilities" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.420399 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="extract-utilities" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.420656 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="247e00dc-e547-4b0c-802d-7e7ef8dd8b58" containerName="registry-server" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.422540 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.440133 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfpfj\" (UniqueName: \"kubernetes.io/projected/0cda615d-6f79-49fc-812e-28f590544089-kube-api-access-tfpfj\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.440302 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-catalog-content\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.440470 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-utilities\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.443923 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btswt"] Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.543001 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-utilities\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.543117 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfpfj\" (UniqueName: \"kubernetes.io/projected/0cda615d-6f79-49fc-812e-28f590544089-kube-api-access-tfpfj\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.543282 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-catalog-content\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.543686 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-utilities\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc 
kubenswrapper[4870]: I0130 09:28:02.543808 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-catalog-content\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.574814 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfpfj\" (UniqueName: \"kubernetes.io/projected/0cda615d-6f79-49fc-812e-28f590544089-kube-api-access-tfpfj\") pod \"community-operators-btswt\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:02 crc kubenswrapper[4870]: I0130 09:28:02.769096 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:03 crc kubenswrapper[4870]: I0130 09:28:03.286338 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btswt"] Jan 30 09:28:03 crc kubenswrapper[4870]: I0130 09:28:03.680170 4870 generic.go:334] "Generic (PLEG): container finished" podID="0cda615d-6f79-49fc-812e-28f590544089" containerID="e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef" exitCode=0 Jan 30 09:28:03 crc kubenswrapper[4870]: I0130 09:28:03.680209 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btswt" event={"ID":"0cda615d-6f79-49fc-812e-28f590544089","Type":"ContainerDied","Data":"e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef"} Jan 30 09:28:03 crc kubenswrapper[4870]: I0130 09:28:03.680235 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btswt" 
event={"ID":"0cda615d-6f79-49fc-812e-28f590544089","Type":"ContainerStarted","Data":"d1f2f6a834009ccba40130c3535b999d1113adab0a7f3036f55f87b8dcfc3789"} Jan 30 09:28:04 crc kubenswrapper[4870]: I0130 09:28:04.694048 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btswt" event={"ID":"0cda615d-6f79-49fc-812e-28f590544089","Type":"ContainerStarted","Data":"7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d"} Jan 30 09:28:05 crc kubenswrapper[4870]: I0130 09:28:05.705994 4870 generic.go:334] "Generic (PLEG): container finished" podID="0cda615d-6f79-49fc-812e-28f590544089" containerID="7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d" exitCode=0 Jan 30 09:28:05 crc kubenswrapper[4870]: I0130 09:28:05.706044 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btswt" event={"ID":"0cda615d-6f79-49fc-812e-28f590544089","Type":"ContainerDied","Data":"7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d"} Jan 30 09:28:06 crc kubenswrapper[4870]: I0130 09:28:06.717459 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btswt" event={"ID":"0cda615d-6f79-49fc-812e-28f590544089","Type":"ContainerStarted","Data":"5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4"} Jan 30 09:28:06 crc kubenswrapper[4870]: I0130 09:28:06.742419 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-btswt" podStartSLOduration=2.315169855 podStartE2EDuration="4.742401852s" podCreationTimestamp="2026-01-30 09:28:02 +0000 UTC" firstStartedPulling="2026-01-30 09:28:03.681724454 +0000 UTC m=+4722.377271563" lastFinishedPulling="2026-01-30 09:28:06.108956451 +0000 UTC m=+4724.804503560" observedRunningTime="2026-01-30 09:28:06.734757723 +0000 UTC m=+4725.430304842" watchObservedRunningTime="2026-01-30 09:28:06.742401852 +0000 UTC 
m=+4725.437948961" Jan 30 09:28:12 crc kubenswrapper[4870]: I0130 09:28:12.770139 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:12 crc kubenswrapper[4870]: I0130 09:28:12.770951 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:13 crc kubenswrapper[4870]: I0130 09:28:13.059080 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:13 crc kubenswrapper[4870]: I0130 09:28:13.842196 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:13 crc kubenswrapper[4870]: I0130 09:28:13.900535 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btswt"] Jan 30 09:28:15 crc kubenswrapper[4870]: I0130 09:28:15.808752 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-btswt" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="registry-server" containerID="cri-o://5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4" gracePeriod=2 Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.311126 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.425919 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-catalog-content\") pod \"0cda615d-6f79-49fc-812e-28f590544089\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.426370 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfpfj\" (UniqueName: \"kubernetes.io/projected/0cda615d-6f79-49fc-812e-28f590544089-kube-api-access-tfpfj\") pod \"0cda615d-6f79-49fc-812e-28f590544089\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.426495 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-utilities\") pod \"0cda615d-6f79-49fc-812e-28f590544089\" (UID: \"0cda615d-6f79-49fc-812e-28f590544089\") " Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.427289 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-utilities" (OuterVolumeSpecName: "utilities") pod "0cda615d-6f79-49fc-812e-28f590544089" (UID: "0cda615d-6f79-49fc-812e-28f590544089"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.436185 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cda615d-6f79-49fc-812e-28f590544089-kube-api-access-tfpfj" (OuterVolumeSpecName: "kube-api-access-tfpfj") pod "0cda615d-6f79-49fc-812e-28f590544089" (UID: "0cda615d-6f79-49fc-812e-28f590544089"). InnerVolumeSpecName "kube-api-access-tfpfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.485762 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cda615d-6f79-49fc-812e-28f590544089" (UID: "0cda615d-6f79-49fc-812e-28f590544089"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.529286 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.529321 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfpfj\" (UniqueName: \"kubernetes.io/projected/0cda615d-6f79-49fc-812e-28f590544089-kube-api-access-tfpfj\") on node \"crc\" DevicePath \"\"" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.529332 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cda615d-6f79-49fc-812e-28f590544089-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.823509 4870 generic.go:334] "Generic (PLEG): container finished" podID="0cda615d-6f79-49fc-812e-28f590544089" containerID="5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4" exitCode=0 Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.823603 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btswt" event={"ID":"0cda615d-6f79-49fc-812e-28f590544089","Type":"ContainerDied","Data":"5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4"} Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.823676 4870 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-btswt" event={"ID":"0cda615d-6f79-49fc-812e-28f590544089","Type":"ContainerDied","Data":"d1f2f6a834009ccba40130c3535b999d1113adab0a7f3036f55f87b8dcfc3789"} Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.823711 4870 scope.go:117] "RemoveContainer" containerID="5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.823729 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btswt" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.874141 4870 scope.go:117] "RemoveContainer" containerID="7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.878381 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btswt"] Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.889275 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-btswt"] Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.903690 4870 scope.go:117] "RemoveContainer" containerID="e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.966318 4870 scope.go:117] "RemoveContainer" containerID="5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4" Jan 30 09:28:16 crc kubenswrapper[4870]: E0130 09:28:16.966845 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4\": container with ID starting with 5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4 not found: ID does not exist" containerID="5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 
09:28:16.966975 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4"} err="failed to get container status \"5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4\": rpc error: code = NotFound desc = could not find container \"5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4\": container with ID starting with 5d6616f5ad39822e7b0feb0d3252a9aefe485e126286e84b2af8bcf7d03012d4 not found: ID does not exist" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.967011 4870 scope.go:117] "RemoveContainer" containerID="7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d" Jan 30 09:28:16 crc kubenswrapper[4870]: E0130 09:28:16.967534 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d\": container with ID starting with 7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d not found: ID does not exist" containerID="7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.967576 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d"} err="failed to get container status \"7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d\": rpc error: code = NotFound desc = could not find container \"7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d\": container with ID starting with 7135d46b8f44718d9556e5e7f714ca0ee6bd5405089d670de1b83da549dbcc6d not found: ID does not exist" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.967601 4870 scope.go:117] "RemoveContainer" containerID="e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef" Jan 30 09:28:16 crc 
kubenswrapper[4870]: E0130 09:28:16.968163 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef\": container with ID starting with e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef not found: ID does not exist" containerID="e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef" Jan 30 09:28:16 crc kubenswrapper[4870]: I0130 09:28:16.968210 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef"} err="failed to get container status \"e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef\": rpc error: code = NotFound desc = could not find container \"e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef\": container with ID starting with e5e3604dc8b68af28314816ffb734f27577b48850481d2510bb0a559fe9a53ef not found: ID does not exist" Jan 30 09:28:18 crc kubenswrapper[4870]: I0130 09:28:18.087535 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cda615d-6f79-49fc-812e-28f590544089" path="/var/lib/kubelet/pods/0cda615d-6f79-49fc-812e-28f590544089/volumes" Jan 30 09:28:55 crc kubenswrapper[4870]: I0130 09:28:55.249105 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:28:55 crc kubenswrapper[4870]: I0130 09:28:55.249733 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 30 09:29:25 crc kubenswrapper[4870]: I0130 09:29:25.249283 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:29:25 crc kubenswrapper[4870]: I0130 09:29:25.249926 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.251154 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.251934 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.252014 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.253215 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"795d2451707be8d2499e0a626a854aa75de48e7e4bf87e79653cebe4102927af"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.253318 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://795d2451707be8d2499e0a626a854aa75de48e7e4bf87e79653cebe4102927af" gracePeriod=600 Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.795621 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="795d2451707be8d2499e0a626a854aa75de48e7e4bf87e79653cebe4102927af" exitCode=0 Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.795700 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"795d2451707be8d2499e0a626a854aa75de48e7e4bf87e79653cebe4102927af"} Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.796134 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"} Jan 30 09:29:55 crc kubenswrapper[4870]: I0130 09:29:55.796161 4870 scope.go:117] "RemoveContainer" containerID="a0995ec52533f007b3aef607a4954202a0ae3e897240f345b324b987bbd17f27" Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.158788 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"] Jan 30 09:30:00 crc kubenswrapper[4870]: 
E0130 09:30:00.159665 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="extract-content"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.159681 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="extract-content"
Jan 30 09:30:00 crc kubenswrapper[4870]: E0130 09:30:00.159717 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="registry-server"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.159726 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="registry-server"
Jan 30 09:30:00 crc kubenswrapper[4870]: E0130 09:30:00.159763 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="extract-utilities"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.159774 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="extract-utilities"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.160038 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cda615d-6f79-49fc-812e-28f590544089" containerName="registry-server"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.160931 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.163486 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.163710 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.176091 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"]
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.220424 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9j52\" (UniqueName: \"kubernetes.io/projected/7370c9a2-2978-4149-8f1f-2c3686a18809-kube-api-access-z9j52\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.220692 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7370c9a2-2978-4149-8f1f-2c3686a18809-config-volume\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.220895 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7370c9a2-2978-4149-8f1f-2c3686a18809-secret-volume\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.322799 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9j52\" (UniqueName: \"kubernetes.io/projected/7370c9a2-2978-4149-8f1f-2c3686a18809-kube-api-access-z9j52\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.323810 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7370c9a2-2978-4149-8f1f-2c3686a18809-config-volume\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.324868 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7370c9a2-2978-4149-8f1f-2c3686a18809-config-volume\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.325064 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7370c9a2-2978-4149-8f1f-2c3686a18809-secret-volume\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.333503 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7370c9a2-2978-4149-8f1f-2c3686a18809-secret-volume\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.352950 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9j52\" (UniqueName: \"kubernetes.io/projected/7370c9a2-2978-4149-8f1f-2c3686a18809-kube-api-access-z9j52\") pod \"collect-profiles-29496090-qlbqf\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:00 crc kubenswrapper[4870]: I0130 09:30:00.500535 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:01 crc kubenswrapper[4870]: I0130 09:30:01.126860 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"]
Jan 30 09:30:01 crc kubenswrapper[4870]: I0130 09:30:01.929667 4870 generic.go:334] "Generic (PLEG): container finished" podID="7370c9a2-2978-4149-8f1f-2c3686a18809" containerID="d6806ea927b023231720616670fecd59d2c946289563c3ab71cac04e0e274c7f" exitCode=0
Jan 30 09:30:01 crc kubenswrapper[4870]: I0130 09:30:01.929722 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf" event={"ID":"7370c9a2-2978-4149-8f1f-2c3686a18809","Type":"ContainerDied","Data":"d6806ea927b023231720616670fecd59d2c946289563c3ab71cac04e0e274c7f"}
Jan 30 09:30:01 crc kubenswrapper[4870]: I0130 09:30:01.929756 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf" event={"ID":"7370c9a2-2978-4149-8f1f-2c3686a18809","Type":"ContainerStarted","Data":"80cd502957099cb35bb72e89cba48705edf68362d02c1d28e48cacf34c0c3dbf"}
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.347838 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.456062 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7370c9a2-2978-4149-8f1f-2c3686a18809-secret-volume\") pod \"7370c9a2-2978-4149-8f1f-2c3686a18809\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") "
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.456194 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7370c9a2-2978-4149-8f1f-2c3686a18809-config-volume\") pod \"7370c9a2-2978-4149-8f1f-2c3686a18809\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") "
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.456307 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9j52\" (UniqueName: \"kubernetes.io/projected/7370c9a2-2978-4149-8f1f-2c3686a18809-kube-api-access-z9j52\") pod \"7370c9a2-2978-4149-8f1f-2c3686a18809\" (UID: \"7370c9a2-2978-4149-8f1f-2c3686a18809\") "
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.457305 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7370c9a2-2978-4149-8f1f-2c3686a18809-config-volume" (OuterVolumeSpecName: "config-volume") pod "7370c9a2-2978-4149-8f1f-2c3686a18809" (UID: "7370c9a2-2978-4149-8f1f-2c3686a18809"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.462253 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7370c9a2-2978-4149-8f1f-2c3686a18809-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7370c9a2-2978-4149-8f1f-2c3686a18809" (UID: "7370c9a2-2978-4149-8f1f-2c3686a18809"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.462274 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7370c9a2-2978-4149-8f1f-2c3686a18809-kube-api-access-z9j52" (OuterVolumeSpecName: "kube-api-access-z9j52") pod "7370c9a2-2978-4149-8f1f-2c3686a18809" (UID: "7370c9a2-2978-4149-8f1f-2c3686a18809"). InnerVolumeSpecName "kube-api-access-z9j52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.558732 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7370c9a2-2978-4149-8f1f-2c3686a18809-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.558787 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7370c9a2-2978-4149-8f1f-2c3686a18809-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.558808 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9j52\" (UniqueName: \"kubernetes.io/projected/7370c9a2-2978-4149-8f1f-2c3686a18809-kube-api-access-z9j52\") on node \"crc\" DevicePath \"\""
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.948186 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf" event={"ID":"7370c9a2-2978-4149-8f1f-2c3686a18809","Type":"ContainerDied","Data":"80cd502957099cb35bb72e89cba48705edf68362d02c1d28e48cacf34c0c3dbf"}
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.948229 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496090-qlbqf"
Jan 30 09:30:03 crc kubenswrapper[4870]: I0130 09:30:03.948232 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80cd502957099cb35bb72e89cba48705edf68362d02c1d28e48cacf34c0c3dbf"
Jan 30 09:30:04 crc kubenswrapper[4870]: I0130 09:30:04.425865 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj"]
Jan 30 09:30:04 crc kubenswrapper[4870]: I0130 09:30:04.435567 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496045-wr5sj"]
Jan 30 09:30:06 crc kubenswrapper[4870]: I0130 09:30:06.086015 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c91153-7a90-4c60-811f-915f8ccf0bdf" path="/var/lib/kubelet/pods/e9c91153-7a90-4c60-811f-915f8ccf0bdf/volumes"
Jan 30 09:30:39 crc kubenswrapper[4870]: I0130 09:30:39.305015 4870 scope.go:117] "RemoveContainer" containerID="479ba30159faf1bd5abe17d0fd8bcbe0c86c787b6f5f69ef68ac1e6330cdb3a2"
Jan 30 09:31:55 crc kubenswrapper[4870]: I0130 09:31:55.250157 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:31:55 crc kubenswrapper[4870]: I0130 09:31:55.250710 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:32:25 crc kubenswrapper[4870]: I0130 09:32:25.250342 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:32:25 crc kubenswrapper[4870]: I0130 09:32:25.250929 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.250247 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.250886 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.250950 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8"
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.251846 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.252027 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" gracePeriod=600
Jan 30 09:32:55 crc kubenswrapper[4870]: E0130 09:32:55.386545 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.580077 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" exitCode=0
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.580134 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"}
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.580213 4870 scope.go:117] "RemoveContainer" containerID="795d2451707be8d2499e0a626a854aa75de48e7e4bf87e79653cebe4102927af"
Jan 30 09:32:55 crc kubenswrapper[4870]: I0130 09:32:55.581235 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"
Jan 30 09:32:55 crc kubenswrapper[4870]: E0130 09:32:55.581594 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.020227 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wcf6s"]
Jan 30 09:33:06 crc kubenswrapper[4870]: E0130 09:33:06.021301 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7370c9a2-2978-4149-8f1f-2c3686a18809" containerName="collect-profiles"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.021314 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="7370c9a2-2978-4149-8f1f-2c3686a18809" containerName="collect-profiles"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.021501 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="7370c9a2-2978-4149-8f1f-2c3686a18809" containerName="collect-profiles"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.022983 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.042757 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcf6s"]
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.170399 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-catalog-content\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.170761 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbwz\" (UniqueName: \"kubernetes.io/projected/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-kube-api-access-7mbwz\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.170905 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-utilities\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.278131 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-catalog-content\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.278247 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbwz\" (UniqueName: \"kubernetes.io/projected/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-kube-api-access-7mbwz\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.278353 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-utilities\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.279183 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-catalog-content\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.279328 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-utilities\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.311794 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbwz\" (UniqueName: \"kubernetes.io/projected/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-kube-api-access-7mbwz\") pod \"redhat-marketplace-wcf6s\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") " pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.345588 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:06 crc kubenswrapper[4870]: I0130 09:33:06.848821 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcf6s"]
Jan 30 09:33:07 crc kubenswrapper[4870]: I0130 09:33:07.074431 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"
Jan 30 09:33:07 crc kubenswrapper[4870]: E0130 09:33:07.075355 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:33:07 crc kubenswrapper[4870]: I0130 09:33:07.701330 4870 generic.go:334] "Generic (PLEG): container finished" podID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerID="e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae" exitCode=0
Jan 30 09:33:07 crc kubenswrapper[4870]: I0130 09:33:07.701420 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcf6s" event={"ID":"99d2fe89-b1ad-4202-81d1-6565aca3e0cf","Type":"ContainerDied","Data":"e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae"}
Jan 30 09:33:07 crc kubenswrapper[4870]: I0130 09:33:07.701611 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcf6s" event={"ID":"99d2fe89-b1ad-4202-81d1-6565aca3e0cf","Type":"ContainerStarted","Data":"3f23315f274408f80fb768e32c0b22194e01f4d4da149a314da0ca738103e0a6"}
Jan 30 09:33:07 crc kubenswrapper[4870]: I0130 09:33:07.704336 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 09:33:08 crc kubenswrapper[4870]: I0130 09:33:08.711113 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcf6s" event={"ID":"99d2fe89-b1ad-4202-81d1-6565aca3e0cf","Type":"ContainerStarted","Data":"9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add"}
Jan 30 09:33:09 crc kubenswrapper[4870]: I0130 09:33:09.723731 4870 generic.go:334] "Generic (PLEG): container finished" podID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerID="9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add" exitCode=0
Jan 30 09:33:09 crc kubenswrapper[4870]: I0130 09:33:09.724028 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcf6s" event={"ID":"99d2fe89-b1ad-4202-81d1-6565aca3e0cf","Type":"ContainerDied","Data":"9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add"}
Jan 30 09:33:10 crc kubenswrapper[4870]: I0130 09:33:10.738281 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcf6s" event={"ID":"99d2fe89-b1ad-4202-81d1-6565aca3e0cf","Type":"ContainerStarted","Data":"ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d"}
Jan 30 09:33:10 crc kubenswrapper[4870]: I0130 09:33:10.787108 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wcf6s" podStartSLOduration=3.374033601 podStartE2EDuration="5.787085388s" podCreationTimestamp="2026-01-30 09:33:05 +0000 UTC" firstStartedPulling="2026-01-30 09:33:07.70404575 +0000 UTC m=+5026.399592869" lastFinishedPulling="2026-01-30 09:33:10.117097547 +0000 UTC m=+5028.812644656" observedRunningTime="2026-01-30 09:33:10.776364503 +0000 UTC m=+5029.471911672" watchObservedRunningTime="2026-01-30 09:33:10.787085388 +0000 UTC m=+5029.482632497"
Jan 30 09:33:16 crc kubenswrapper[4870]: I0130 09:33:16.346912 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:16 crc kubenswrapper[4870]: I0130 09:33:16.347313 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:16 crc kubenswrapper[4870]: I0130 09:33:16.399251 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:16 crc kubenswrapper[4870]: I0130 09:33:16.848892 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:16 crc kubenswrapper[4870]: I0130 09:33:16.909319 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcf6s"]
Jan 30 09:33:18 crc kubenswrapper[4870]: I0130 09:33:18.811403 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wcf6s" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="registry-server" containerID="cri-o://ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d" gracePeriod=2
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.074994 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"
Jan 30 09:33:19 crc kubenswrapper[4870]: E0130 09:33:19.075464 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.367286 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.473517 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-catalog-content\") pod \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") "
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.473632 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mbwz\" (UniqueName: \"kubernetes.io/projected/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-kube-api-access-7mbwz\") pod \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") "
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.473762 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-utilities\") pod \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\" (UID: \"99d2fe89-b1ad-4202-81d1-6565aca3e0cf\") "
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.475130 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-utilities" (OuterVolumeSpecName: "utilities") pod "99d2fe89-b1ad-4202-81d1-6565aca3e0cf" (UID: "99d2fe89-b1ad-4202-81d1-6565aca3e0cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.499719 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-kube-api-access-7mbwz" (OuterVolumeSpecName: "kube-api-access-7mbwz") pod "99d2fe89-b1ad-4202-81d1-6565aca3e0cf" (UID: "99d2fe89-b1ad-4202-81d1-6565aca3e0cf"). InnerVolumeSpecName "kube-api-access-7mbwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.512244 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99d2fe89-b1ad-4202-81d1-6565aca3e0cf" (UID: "99d2fe89-b1ad-4202-81d1-6565aca3e0cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.576758 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.576799 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.576815 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mbwz\" (UniqueName: \"kubernetes.io/projected/99d2fe89-b1ad-4202-81d1-6565aca3e0cf-kube-api-access-7mbwz\") on node \"crc\" DevicePath \"\""
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.847449 4870 generic.go:334] "Generic (PLEG): container finished" podID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerID="ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d" exitCode=0
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.847518 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcf6s" event={"ID":"99d2fe89-b1ad-4202-81d1-6565aca3e0cf","Type":"ContainerDied","Data":"ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d"}
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.847557 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcf6s" event={"ID":"99d2fe89-b1ad-4202-81d1-6565aca3e0cf","Type":"ContainerDied","Data":"3f23315f274408f80fb768e32c0b22194e01f4d4da149a314da0ca738103e0a6"}
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.847596 4870 scope.go:117] "RemoveContainer" containerID="ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.847842 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcf6s"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.891564 4870 scope.go:117] "RemoveContainer" containerID="9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.905831 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcf6s"]
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.916405 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcf6s"]
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.924094 4870 scope.go:117] "RemoveContainer" containerID="e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.984127 4870 scope.go:117] "RemoveContainer" containerID="ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d"
Jan 30 09:33:19 crc kubenswrapper[4870]: E0130 09:33:19.984552 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d\": container with ID starting with ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d not found: ID does not exist" containerID="ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.984587 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d"} err="failed to get container status \"ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d\": rpc error: code = NotFound desc = could not find container \"ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d\": container with ID starting with ba156291c9638d3baa958e1c97ef9cfd36f5d21f21cdca69c41e38e0a513809d not found: ID does not exist"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.984611 4870 scope.go:117] "RemoveContainer" containerID="9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add"
Jan 30 09:33:19 crc kubenswrapper[4870]: E0130 09:33:19.984973 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add\": container with ID starting with 9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add not found: ID does not exist" containerID="9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.985075 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add"} err="failed to get container status \"9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add\": rpc error: code = NotFound desc = could not find container \"9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add\": container with ID starting with 9f570762efcb27f98a8655a98226de2578ec425e08bb59eec1f4c19543e74add not found: ID does not exist"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.985144 4870 scope.go:117] "RemoveContainer" containerID="e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae"
Jan 30 09:33:19 crc kubenswrapper[4870]: E0130 09:33:19.985453 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae\": container with ID starting with e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae not found: ID does not exist" containerID="e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae"
Jan 30 09:33:19 crc kubenswrapper[4870]: I0130 09:33:19.985478 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae"} err="failed to get container status \"e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae\": rpc error: code = NotFound desc = could not find container \"e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae\": container with ID starting with e560180dbb8a0d513ce90ea73f6871e8d968c0b8877f802a39a5bf86ec4eb0ae not found: ID does not exist"
Jan 30 09:33:20 crc kubenswrapper[4870]: I0130 09:33:20.086519 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" path="/var/lib/kubelet/pods/99d2fe89-b1ad-4202-81d1-6565aca3e0cf/volumes"
Jan 30 09:33:22 crc kubenswrapper[4870]: I0130 09:33:22.872993 4870 generic.go:334] "Generic (PLEG): container finished" podID="dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" containerID="e510327daa135710d56632aefcbd974a031585074a72c0b411cbaf1ee33eb7a9" exitCode=1
Jan 30 09:33:22 crc kubenswrapper[4870]: I0130 09:33:22.873037 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a","Type":"ContainerDied","Data":"e510327daa135710d56632aefcbd974a031585074a72c0b411cbaf1ee33eb7a9"}
Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.224322 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.281805 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") "
Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.281888 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-workdir\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") "
Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.281918 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ca-certs\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") "
Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.281939 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-temporary\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") "
Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.281989 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-config-data\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") "
Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.282021 4870 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config-secret\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.282184 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.282285 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bggh\" (UniqueName: \"kubernetes.io/projected/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-kube-api-access-8bggh\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.282302 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ssh-key\") pod \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\" (UID: \"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a\") " Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.284153 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-config-data" (OuterVolumeSpecName: "config-data") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.286280 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.289918 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.384475 4870 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.384515 4870 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.384529 4870 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.799340 4870 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.799841 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-kube-api-access-8bggh" (OuterVolumeSpecName: "kube-api-access-8bggh") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "kube-api-access-8bggh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.897434 4870 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.897471 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bggh\" (UniqueName: \"kubernetes.io/projected/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-kube-api-access-8bggh\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.910807 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"dc531a0b-3bc8-45c0-935d-6425c9ee5e3a","Type":"ContainerDied","Data":"1c881927627a156ba1416d85da9f209f5ec355b05e5dce2ac4e41aa800f2573b"} Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.910859 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c881927627a156ba1416d85da9f209f5ec355b05e5dce2ac4e41aa800f2573b" Jan 30 09:33:24 crc kubenswrapper[4870]: I0130 09:33:24.910922 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.025378 4870 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.026682 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.039413 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.086048 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.097711 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" (UID: "dc531a0b-3bc8-45c0-935d-6425c9ee5e3a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.122478 4870 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.122642 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.122766 4870 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.122951 4870 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc531a0b-3bc8-45c0-935d-6425c9ee5e3a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:25 crc kubenswrapper[4870]: I0130 09:33:25.123067 4870 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.757061 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 09:33:29 crc kubenswrapper[4870]: E0130 09:33:29.758167 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" containerName="tempest-tests-tempest-tests-runner" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.758182 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" containerName="tempest-tests-tempest-tests-runner" Jan 30 09:33:29 crc 
kubenswrapper[4870]: E0130 09:33:29.758203 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="extract-content" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.758210 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="extract-content" Jan 30 09:33:29 crc kubenswrapper[4870]: E0130 09:33:29.758232 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="extract-utilities" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.758240 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="extract-utilities" Jan 30 09:33:29 crc kubenswrapper[4870]: E0130 09:33:29.758255 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="registry-server" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.758263 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="registry-server" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.758487 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc531a0b-3bc8-45c0-935d-6425c9ee5e3a" containerName="tempest-tests-tempest-tests-runner" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.758509 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d2fe89-b1ad-4202-81d1-6565aca3e0cf" containerName="registry-server" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.759505 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.765373 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w7v26" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.772645 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.825849 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ca368ef3-843d-4326-a899-9f4a1f6466c3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.826088 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs6r5\" (UniqueName: \"kubernetes.io/projected/ca368ef3-843d-4326-a899-9f4a1f6466c3-kube-api-access-xs6r5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ca368ef3-843d-4326-a899-9f4a1f6466c3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.928054 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs6r5\" (UniqueName: \"kubernetes.io/projected/ca368ef3-843d-4326-a899-9f4a1f6466c3-kube-api-access-xs6r5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ca368ef3-843d-4326-a899-9f4a1f6466c3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.928152 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ca368ef3-843d-4326-a899-9f4a1f6466c3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.928799 4870 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ca368ef3-843d-4326-a899-9f4a1f6466c3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.948957 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs6r5\" (UniqueName: \"kubernetes.io/projected/ca368ef3-843d-4326-a899-9f4a1f6466c3-kube-api-access-xs6r5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ca368ef3-843d-4326-a899-9f4a1f6466c3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:29 crc kubenswrapper[4870]: I0130 09:33:29.968860 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ca368ef3-843d-4326-a899-9f4a1f6466c3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:30 crc kubenswrapper[4870]: I0130 09:33:30.080585 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 09:33:30 crc kubenswrapper[4870]: I0130 09:33:30.537904 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 09:33:30 crc kubenswrapper[4870]: I0130 09:33:30.969162 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ca368ef3-843d-4326-a899-9f4a1f6466c3","Type":"ContainerStarted","Data":"85118fa6a8ee5d5813183f40be30b325e6a66007bd7394e610390adc3ed79761"} Jan 30 09:33:31 crc kubenswrapper[4870]: I0130 09:33:31.978721 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ca368ef3-843d-4326-a899-9f4a1f6466c3","Type":"ContainerStarted","Data":"4c979379ab0a5d4ef1cfcc71e90ff80e7d593cf8b7ab0c153fa21e4a25e4bd34"} Jan 30 09:33:31 crc kubenswrapper[4870]: I0130 09:33:31.994915 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.042209956 podStartE2EDuration="2.994893875s" podCreationTimestamp="2026-01-30 09:33:29 +0000 UTC" firstStartedPulling="2026-01-30 09:33:30.541348772 +0000 UTC m=+5049.236895891" lastFinishedPulling="2026-01-30 09:33:31.494032701 +0000 UTC m=+5050.189579810" observedRunningTime="2026-01-30 09:33:31.992919184 +0000 UTC m=+5050.688466293" watchObservedRunningTime="2026-01-30 09:33:31.994893875 +0000 UTC m=+5050.690440984" Jan 30 09:33:34 crc kubenswrapper[4870]: I0130 09:33:34.074657 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:33:34 crc kubenswrapper[4870]: E0130 09:33:34.075619 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:33:46 crc kubenswrapper[4870]: I0130 09:33:46.075322 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:33:46 crc kubenswrapper[4870]: E0130 09:33:46.076208 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:34:01 crc kubenswrapper[4870]: I0130 09:34:01.075181 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:34:01 crc kubenswrapper[4870]: E0130 09:34:01.076090 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:34:15 crc kubenswrapper[4870]: I0130 09:34:15.074605 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:34:15 crc kubenswrapper[4870]: E0130 09:34:15.075539 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.278199 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngvkt/must-gather-jl6kn"] Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.280560 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.282771 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ngvkt"/"kube-root-ca.crt" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.283066 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ngvkt"/"openshift-service-ca.crt" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.283737 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ngvkt"/"default-dockercfg-4wdwf" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.297955 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngvkt/must-gather-jl6kn"] Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.316762 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6769b74f-20a7-48a8-b39b-d812418dbab4-must-gather-output\") pod \"must-gather-jl6kn\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.316832 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltpzc\" 
(UniqueName: \"kubernetes.io/projected/6769b74f-20a7-48a8-b39b-d812418dbab4-kube-api-access-ltpzc\") pod \"must-gather-jl6kn\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.418633 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltpzc\" (UniqueName: \"kubernetes.io/projected/6769b74f-20a7-48a8-b39b-d812418dbab4-kube-api-access-ltpzc\") pod \"must-gather-jl6kn\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.418908 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6769b74f-20a7-48a8-b39b-d812418dbab4-must-gather-output\") pod \"must-gather-jl6kn\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.419362 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6769b74f-20a7-48a8-b39b-d812418dbab4-must-gather-output\") pod \"must-gather-jl6kn\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.441799 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltpzc\" (UniqueName: \"kubernetes.io/projected/6769b74f-20a7-48a8-b39b-d812418dbab4-kube-api-access-ltpzc\") pod \"must-gather-jl6kn\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:22 crc kubenswrapper[4870]: I0130 09:34:22.605514 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:34:23 crc kubenswrapper[4870]: I0130 09:34:23.208164 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngvkt/must-gather-jl6kn"] Jan 30 09:34:23 crc kubenswrapper[4870]: I0130 09:34:23.479870 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" event={"ID":"6769b74f-20a7-48a8-b39b-d812418dbab4","Type":"ContainerStarted","Data":"db30a1c0b10d50233163a592b08ff9db56770b71ea9dff19a11314adcb837750"} Jan 30 09:34:26 crc kubenswrapper[4870]: I0130 09:34:26.074708 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:34:26 crc kubenswrapper[4870]: E0130 09:34:26.075545 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:34:29 crc kubenswrapper[4870]: I0130 09:34:29.535799 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" event={"ID":"6769b74f-20a7-48a8-b39b-d812418dbab4","Type":"ContainerStarted","Data":"94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868"} Jan 30 09:34:29 crc kubenswrapper[4870]: I0130 09:34:29.536207 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" event={"ID":"6769b74f-20a7-48a8-b39b-d812418dbab4","Type":"ContainerStarted","Data":"6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879"} Jan 30 09:34:29 crc kubenswrapper[4870]: I0130 09:34:29.555304 4870 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" podStartSLOduration=1.7722927689999999 podStartE2EDuration="7.555280891s" podCreationTimestamp="2026-01-30 09:34:22 +0000 UTC" firstStartedPulling="2026-01-30 09:34:23.225369438 +0000 UTC m=+5101.920916547" lastFinishedPulling="2026-01-30 09:34:29.00835756 +0000 UTC m=+5107.703904669" observedRunningTime="2026-01-30 09:34:29.549361187 +0000 UTC m=+5108.244908296" watchObservedRunningTime="2026-01-30 09:34:29.555280891 +0000 UTC m=+5108.250828000" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.229220 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-95d7g"] Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.230920 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.286343 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8c90b48-a96e-4c40-aff8-ed26b5d74540-host\") pod \"crc-debug-95d7g\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.286458 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt22h\" (UniqueName: \"kubernetes.io/projected/e8c90b48-a96e-4c40-aff8-ed26b5d74540-kube-api-access-xt22h\") pod \"crc-debug-95d7g\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.388257 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8c90b48-a96e-4c40-aff8-ed26b5d74540-host\") pod \"crc-debug-95d7g\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " 
pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.388320 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8c90b48-a96e-4c40-aff8-ed26b5d74540-host\") pod \"crc-debug-95d7g\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.388589 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt22h\" (UniqueName: \"kubernetes.io/projected/e8c90b48-a96e-4c40-aff8-ed26b5d74540-kube-api-access-xt22h\") pod \"crc-debug-95d7g\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.498829 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt22h\" (UniqueName: \"kubernetes.io/projected/e8c90b48-a96e-4c40-aff8-ed26b5d74540-kube-api-access-xt22h\") pod \"crc-debug-95d7g\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:34 crc kubenswrapper[4870]: I0130 09:34:34.553687 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:34:35 crc kubenswrapper[4870]: I0130 09:34:35.593038 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" event={"ID":"e8c90b48-a96e-4c40-aff8-ed26b5d74540","Type":"ContainerStarted","Data":"b44d4efd6de06f1f27935332895224e30f5802b5af4349b9627d9aac0d98adc1"} Jan 30 09:34:39 crc kubenswrapper[4870]: I0130 09:34:39.075508 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:34:39 crc kubenswrapper[4870]: E0130 09:34:39.076148 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:34:46 crc kubenswrapper[4870]: I0130 09:34:46.007235 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" event={"ID":"e8c90b48-a96e-4c40-aff8-ed26b5d74540","Type":"ContainerStarted","Data":"c10769db3e913e00355f8966e729b5b6b9071d652adf30ce568e54dc81b0dfbf"} Jan 30 09:34:46 crc kubenswrapper[4870]: I0130 09:34:46.025918 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" podStartSLOduration=1.5754330539999999 podStartE2EDuration="12.025896102s" podCreationTimestamp="2026-01-30 09:34:34 +0000 UTC" firstStartedPulling="2026-01-30 09:34:34.591397416 +0000 UTC m=+5113.286944525" lastFinishedPulling="2026-01-30 09:34:45.041860474 +0000 UTC m=+5123.737407573" observedRunningTime="2026-01-30 09:34:46.019394679 +0000 UTC m=+5124.714941788" watchObservedRunningTime="2026-01-30 09:34:46.025896102 
+0000 UTC m=+5124.721443211" Jan 30 09:34:51 crc kubenswrapper[4870]: I0130 09:34:51.075777 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:34:51 crc kubenswrapper[4870]: E0130 09:34:51.078575 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:35:04 crc kubenswrapper[4870]: I0130 09:35:04.075273 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:35:04 crc kubenswrapper[4870]: E0130 09:35:04.076117 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:35:18 crc kubenswrapper[4870]: I0130 09:35:18.074726 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:35:18 crc kubenswrapper[4870]: E0130 09:35:18.075487 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" 
podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:35:33 crc kubenswrapper[4870]: I0130 09:35:33.075497 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:35:33 crc kubenswrapper[4870]: E0130 09:35:33.078226 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:35:37 crc kubenswrapper[4870]: I0130 09:35:37.532633 4870 generic.go:334] "Generic (PLEG): container finished" podID="e8c90b48-a96e-4c40-aff8-ed26b5d74540" containerID="c10769db3e913e00355f8966e729b5b6b9071d652adf30ce568e54dc81b0dfbf" exitCode=0 Jan 30 09:35:37 crc kubenswrapper[4870]: I0130 09:35:37.532723 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" event={"ID":"e8c90b48-a96e-4c40-aff8-ed26b5d74540","Type":"ContainerDied","Data":"c10769db3e913e00355f8966e729b5b6b9071d652adf30ce568e54dc81b0dfbf"} Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.673459 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.715509 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-95d7g"] Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.723760 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-95d7g"] Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.825669 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt22h\" (UniqueName: \"kubernetes.io/projected/e8c90b48-a96e-4c40-aff8-ed26b5d74540-kube-api-access-xt22h\") pod \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.825778 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8c90b48-a96e-4c40-aff8-ed26b5d74540-host\") pod \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\" (UID: \"e8c90b48-a96e-4c40-aff8-ed26b5d74540\") " Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.825943 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8c90b48-a96e-4c40-aff8-ed26b5d74540-host" (OuterVolumeSpecName: "host") pod "e8c90b48-a96e-4c40-aff8-ed26b5d74540" (UID: "e8c90b48-a96e-4c40-aff8-ed26b5d74540"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.826475 4870 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8c90b48-a96e-4c40-aff8-ed26b5d74540-host\") on node \"crc\" DevicePath \"\"" Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.836922 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c90b48-a96e-4c40-aff8-ed26b5d74540-kube-api-access-xt22h" (OuterVolumeSpecName: "kube-api-access-xt22h") pod "e8c90b48-a96e-4c40-aff8-ed26b5d74540" (UID: "e8c90b48-a96e-4c40-aff8-ed26b5d74540"). InnerVolumeSpecName "kube-api-access-xt22h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:35:38 crc kubenswrapper[4870]: I0130 09:35:38.928548 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt22h\" (UniqueName: \"kubernetes.io/projected/e8c90b48-a96e-4c40-aff8-ed26b5d74540-kube-api-access-xt22h\") on node \"crc\" DevicePath \"\"" Jan 30 09:35:39 crc kubenswrapper[4870]: I0130 09:35:39.562638 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b44d4efd6de06f1f27935332895224e30f5802b5af4349b9627d9aac0d98adc1" Jan 30 09:35:39 crc kubenswrapper[4870]: I0130 09:35:39.562715 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-95d7g" Jan 30 09:35:39 crc kubenswrapper[4870]: I0130 09:35:39.908832 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-gqfg9"] Jan 30 09:35:39 crc kubenswrapper[4870]: E0130 09:35:39.909685 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c90b48-a96e-4c40-aff8-ed26b5d74540" containerName="container-00" Jan 30 09:35:39 crc kubenswrapper[4870]: I0130 09:35:39.909703 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c90b48-a96e-4c40-aff8-ed26b5d74540" containerName="container-00" Jan 30 09:35:39 crc kubenswrapper[4870]: I0130 09:35:39.909993 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c90b48-a96e-4c40-aff8-ed26b5d74540" containerName="container-00" Jan 30 09:35:39 crc kubenswrapper[4870]: I0130 09:35:39.910949 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.059028 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee69b9db-1ce7-4877-8e0a-f44a22b61917-host\") pod \"crc-debug-gqfg9\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.059178 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtc5p\" (UniqueName: \"kubernetes.io/projected/ee69b9db-1ce7-4877-8e0a-f44a22b61917-kube-api-access-gtc5p\") pod \"crc-debug-gqfg9\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.086210 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c90b48-a96e-4c40-aff8-ed26b5d74540" 
path="/var/lib/kubelet/pods/e8c90b48-a96e-4c40-aff8-ed26b5d74540/volumes" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.161544 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee69b9db-1ce7-4877-8e0a-f44a22b61917-host\") pod \"crc-debug-gqfg9\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.161681 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtc5p\" (UniqueName: \"kubernetes.io/projected/ee69b9db-1ce7-4877-8e0a-f44a22b61917-kube-api-access-gtc5p\") pod \"crc-debug-gqfg9\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.161685 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee69b9db-1ce7-4877-8e0a-f44a22b61917-host\") pod \"crc-debug-gqfg9\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.178079 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtc5p\" (UniqueName: \"kubernetes.io/projected/ee69b9db-1ce7-4877-8e0a-f44a22b61917-kube-api-access-gtc5p\") pod \"crc-debug-gqfg9\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.229634 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:40 crc kubenswrapper[4870]: I0130 09:35:40.572031 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" event={"ID":"ee69b9db-1ce7-4877-8e0a-f44a22b61917","Type":"ContainerStarted","Data":"0d15bc4f5f1cdd64af25a4f04a8ea84d36babcf03d847e6f57d65c94e43dffba"} Jan 30 09:35:41 crc kubenswrapper[4870]: I0130 09:35:41.583236 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" event={"ID":"ee69b9db-1ce7-4877-8e0a-f44a22b61917","Type":"ContainerStarted","Data":"59efa965c2adfae979420af1ce17d266ff44eee37b7a015447ef2d517758282c"} Jan 30 09:35:41 crc kubenswrapper[4870]: I0130 09:35:41.598118 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" podStartSLOduration=2.598102275 podStartE2EDuration="2.598102275s" podCreationTimestamp="2026-01-30 09:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 09:35:41.594789883 +0000 UTC m=+5180.290336992" watchObservedRunningTime="2026-01-30 09:35:41.598102275 +0000 UTC m=+5180.293649384" Jan 30 09:35:42 crc kubenswrapper[4870]: I0130 09:35:42.596696 4870 generic.go:334] "Generic (PLEG): container finished" podID="ee69b9db-1ce7-4877-8e0a-f44a22b61917" containerID="59efa965c2adfae979420af1ce17d266ff44eee37b7a015447ef2d517758282c" exitCode=0 Jan 30 09:35:42 crc kubenswrapper[4870]: I0130 09:35:42.596751 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" event={"ID":"ee69b9db-1ce7-4877-8e0a-f44a22b61917","Type":"ContainerDied","Data":"59efa965c2adfae979420af1ce17d266ff44eee37b7a015447ef2d517758282c"} Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.775758 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.944885 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtc5p\" (UniqueName: \"kubernetes.io/projected/ee69b9db-1ce7-4877-8e0a-f44a22b61917-kube-api-access-gtc5p\") pod \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.944939 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee69b9db-1ce7-4877-8e0a-f44a22b61917-host\") pod \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\" (UID: \"ee69b9db-1ce7-4877-8e0a-f44a22b61917\") " Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.945152 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee69b9db-1ce7-4877-8e0a-f44a22b61917-host" (OuterVolumeSpecName: "host") pod "ee69b9db-1ce7-4877-8e0a-f44a22b61917" (UID: "ee69b9db-1ce7-4877-8e0a-f44a22b61917"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.945675 4870 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee69b9db-1ce7-4877-8e0a-f44a22b61917-host\") on node \"crc\" DevicePath \"\"" Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.956705 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee69b9db-1ce7-4877-8e0a-f44a22b61917-kube-api-access-gtc5p" (OuterVolumeSpecName: "kube-api-access-gtc5p") pod "ee69b9db-1ce7-4877-8e0a-f44a22b61917" (UID: "ee69b9db-1ce7-4877-8e0a-f44a22b61917"). InnerVolumeSpecName "kube-api-access-gtc5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.985845 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-gqfg9"] Jan 30 09:35:43 crc kubenswrapper[4870]: I0130 09:35:43.999689 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-gqfg9"] Jan 30 09:35:44 crc kubenswrapper[4870]: I0130 09:35:44.048028 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtc5p\" (UniqueName: \"kubernetes.io/projected/ee69b9db-1ce7-4877-8e0a-f44a22b61917-kube-api-access-gtc5p\") on node \"crc\" DevicePath \"\"" Jan 30 09:35:44 crc kubenswrapper[4870]: I0130 09:35:44.085815 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee69b9db-1ce7-4877-8e0a-f44a22b61917" path="/var/lib/kubelet/pods/ee69b9db-1ce7-4877-8e0a-f44a22b61917/volumes" Jan 30 09:35:44 crc kubenswrapper[4870]: I0130 09:35:44.664683 4870 scope.go:117] "RemoveContainer" containerID="59efa965c2adfae979420af1ce17d266ff44eee37b7a015447ef2d517758282c" Jan 30 09:35:44 crc kubenswrapper[4870]: I0130 09:35:44.664738 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-gqfg9" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.198643 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-g52p8"] Jan 30 09:35:45 crc kubenswrapper[4870]: E0130 09:35:45.200410 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee69b9db-1ce7-4877-8e0a-f44a22b61917" containerName="container-00" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.200437 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee69b9db-1ce7-4877-8e0a-f44a22b61917" containerName="container-00" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.201035 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee69b9db-1ce7-4877-8e0a-f44a22b61917" containerName="container-00" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.202434 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.383118 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/384e367c-c2a4-4dbf-bb60-a903590c8ead-host\") pod \"crc-debug-g52p8\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.383323 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56nxq\" (UniqueName: \"kubernetes.io/projected/384e367c-c2a4-4dbf-bb60-a903590c8ead-kube-api-access-56nxq\") pod \"crc-debug-g52p8\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.484930 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56nxq\" (UniqueName: 
\"kubernetes.io/projected/384e367c-c2a4-4dbf-bb60-a903590c8ead-kube-api-access-56nxq\") pod \"crc-debug-g52p8\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.485127 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/384e367c-c2a4-4dbf-bb60-a903590c8ead-host\") pod \"crc-debug-g52p8\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.485321 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/384e367c-c2a4-4dbf-bb60-a903590c8ead-host\") pod \"crc-debug-g52p8\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.507698 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56nxq\" (UniqueName: \"kubernetes.io/projected/384e367c-c2a4-4dbf-bb60-a903590c8ead-kube-api-access-56nxq\") pod \"crc-debug-g52p8\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.528032 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:45 crc kubenswrapper[4870]: I0130 09:35:45.683218 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-g52p8" event={"ID":"384e367c-c2a4-4dbf-bb60-a903590c8ead","Type":"ContainerStarted","Data":"a95981cbb1ed15aeacc2b4b511f205c4253ce1a27ef8212031933dad38699908"} Jan 30 09:35:46 crc kubenswrapper[4870]: I0130 09:35:46.694388 4870 generic.go:334] "Generic (PLEG): container finished" podID="384e367c-c2a4-4dbf-bb60-a903590c8ead" containerID="5c29178d54a5c60a34db9756b29000aea4dcd5f164a6b57720fc7f6a4eda55cd" exitCode=0 Jan 30 09:35:46 crc kubenswrapper[4870]: I0130 09:35:46.694485 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/crc-debug-g52p8" event={"ID":"384e367c-c2a4-4dbf-bb60-a903590c8ead","Type":"ContainerDied","Data":"5c29178d54a5c60a34db9756b29000aea4dcd5f164a6b57720fc7f6a4eda55cd"} Jan 30 09:35:46 crc kubenswrapper[4870]: I0130 09:35:46.728810 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-g52p8"] Jan 30 09:35:46 crc kubenswrapper[4870]: I0130 09:35:46.738351 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ngvkt/crc-debug-g52p8"] Jan 30 09:35:47 crc kubenswrapper[4870]: I0130 09:35:47.829177 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:35:47 crc kubenswrapper[4870]: I0130 09:35:47.940857 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/384e367c-c2a4-4dbf-bb60-a903590c8ead-host\") pod \"384e367c-c2a4-4dbf-bb60-a903590c8ead\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " Jan 30 09:35:47 crc kubenswrapper[4870]: I0130 09:35:47.941005 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56nxq\" (UniqueName: \"kubernetes.io/projected/384e367c-c2a4-4dbf-bb60-a903590c8ead-kube-api-access-56nxq\") pod \"384e367c-c2a4-4dbf-bb60-a903590c8ead\" (UID: \"384e367c-c2a4-4dbf-bb60-a903590c8ead\") " Jan 30 09:35:47 crc kubenswrapper[4870]: I0130 09:35:47.941003 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/384e367c-c2a4-4dbf-bb60-a903590c8ead-host" (OuterVolumeSpecName: "host") pod "384e367c-c2a4-4dbf-bb60-a903590c8ead" (UID: "384e367c-c2a4-4dbf-bb60-a903590c8ead"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 09:35:47 crc kubenswrapper[4870]: I0130 09:35:47.941546 4870 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/384e367c-c2a4-4dbf-bb60-a903590c8ead-host\") on node \"crc\" DevicePath \"\"" Jan 30 09:35:47 crc kubenswrapper[4870]: I0130 09:35:47.948221 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384e367c-c2a4-4dbf-bb60-a903590c8ead-kube-api-access-56nxq" (OuterVolumeSpecName: "kube-api-access-56nxq") pod "384e367c-c2a4-4dbf-bb60-a903590c8ead" (UID: "384e367c-c2a4-4dbf-bb60-a903590c8ead"). InnerVolumeSpecName "kube-api-access-56nxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:35:48 crc kubenswrapper[4870]: I0130 09:35:48.043423 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56nxq\" (UniqueName: \"kubernetes.io/projected/384e367c-c2a4-4dbf-bb60-a903590c8ead-kube-api-access-56nxq\") on node \"crc\" DevicePath \"\"" Jan 30 09:35:48 crc kubenswrapper[4870]: I0130 09:35:48.075212 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:35:48 crc kubenswrapper[4870]: E0130 09:35:48.075475 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:35:48 crc kubenswrapper[4870]: I0130 09:35:48.085331 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384e367c-c2a4-4dbf-bb60-a903590c8ead" path="/var/lib/kubelet/pods/384e367c-c2a4-4dbf-bb60-a903590c8ead/volumes" Jan 30 09:35:48 crc kubenswrapper[4870]: I0130 09:35:48.716340 4870 scope.go:117] "RemoveContainer" containerID="5c29178d54a5c60a34db9756b29000aea4dcd5f164a6b57720fc7f6a4eda55cd" Jan 30 09:35:48 crc kubenswrapper[4870]: I0130 09:35:48.716360 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/crc-debug-g52p8" Jan 30 09:36:02 crc kubenswrapper[4870]: I0130 09:36:02.083525 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:36:02 crc kubenswrapper[4870]: E0130 09:36:02.084218 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:36:15 crc kubenswrapper[4870]: I0130 09:36:15.075288 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:36:15 crc kubenswrapper[4870]: E0130 09:36:15.076800 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:36:22 crc kubenswrapper[4870]: I0130 09:36:22.350413 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5564cc7ccb-wnwrs_304a486b-b7cf-4418-82c9-7795b2331284/barbican-api/0.log" Jan 30 09:36:22 crc kubenswrapper[4870]: I0130 09:36:22.525438 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5564cc7ccb-wnwrs_304a486b-b7cf-4418-82c9-7795b2331284/barbican-api-log/0.log" Jan 30 09:36:22 crc kubenswrapper[4870]: I0130 09:36:22.543096 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-54fb8bddb6-w78xn_8a32795f-6328-4d51-a69a-60be965b17f0/barbican-keystone-listener/0.log" Jan 30 09:36:22 crc kubenswrapper[4870]: I0130 09:36:22.656680 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-54fb8bddb6-w78xn_8a32795f-6328-4d51-a69a-60be965b17f0/barbican-keystone-listener-log/0.log" Jan 30 09:36:22 crc kubenswrapper[4870]: I0130 09:36:22.768903 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b94ff658f-bmntr_a3bc44ff-bc04-4e44-bb13-ff62f43057f5/barbican-worker/0.log" Jan 30 09:36:22 crc kubenswrapper[4870]: I0130 09:36:22.865149 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b94ff658f-bmntr_a3bc44ff-bc04-4e44-bb13-ff62f43057f5/barbican-worker-log/0.log" Jan 30 09:36:23 crc kubenswrapper[4870]: I0130 09:36:23.003979 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-v6v4c_620aba2c-f389-4fc9-a27c-28c937894f7d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:23 crc kubenswrapper[4870]: I0130 09:36:23.389027 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0944a474-a4a5-4ff7-95cf-cd783c051a16/ceilometer-notification-agent/0.log" Jan 30 09:36:23 crc kubenswrapper[4870]: I0130 09:36:23.406918 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0944a474-a4a5-4ff7-95cf-cd783c051a16/ceilometer-central-agent/0.log" Jan 30 09:36:23 crc kubenswrapper[4870]: I0130 09:36:23.417835 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0944a474-a4a5-4ff7-95cf-cd783c051a16/proxy-httpd/0.log" Jan 30 09:36:23 crc kubenswrapper[4870]: I0130 09:36:23.469318 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0944a474-a4a5-4ff7-95cf-cd783c051a16/sg-core/0.log" Jan 
30 09:36:23 crc kubenswrapper[4870]: I0130 09:36:23.634469 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dcb916a9-c812-4e35-91d2-a4cc4ef78fc3/cinder-api-log/0.log" Jan 30 09:36:23 crc kubenswrapper[4870]: I0130 09:36:23.992247 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cecf4070-2dd9-496d-bf4d-7f456eb6ed72/probe/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.033084 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dcb916a9-c812-4e35-91d2-a4cc4ef78fc3/cinder-api/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.185058 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cecf4070-2dd9-496d-bf4d-7f456eb6ed72/cinder-backup/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.223652 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e7a1bbc0-d212-4a83-bea0-d40c261ddb18/cinder-scheduler/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.358601 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e7a1bbc0-d212-4a83-bea0-d40c261ddb18/probe/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.550971 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_06465a52-3f34-45fd-b95e-e679adcb59e6/probe/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.551251 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_06465a52-3f34-45fd-b95e-e679adcb59e6/cinder-volume/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.771005 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_56215e10-017e-4662-92ab-8f25178c0fab/cinder-volume/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.811256 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-nfs-2-0_56215e10-017e-4662-92ab-8f25178c0fab/probe/0.log" Jan 30 09:36:24 crc kubenswrapper[4870]: I0130 09:36:24.938772 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-948rh_1eea19c9-87be-4160-8c11-c7ecd13cf088/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.143927 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66968b76ff-bk2j6_3f90c906-9b1e-4df6-8b94-367ae01963b7/init/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.233086 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b4t7n_f32f4b01-631a-4f4b-8ffb-f0873b819de0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.417504 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66968b76ff-bk2j6_3f90c906-9b1e-4df6-8b94-367ae01963b7/init/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.480393 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p67q7_9bef3cd3-94ab-486e-91de-c0ede57769d8/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.556802 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66968b76ff-bk2j6_3f90c906-9b1e-4df6-8b94-367ae01963b7/dnsmasq-dns/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.713693 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_743b8276-eb2e-49fa-b493-fb83f20837ed/glance-httpd/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.743075 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_743b8276-eb2e-49fa-b493-fb83f20837ed/glance-log/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.906384 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2efb8d24-a358-43df-af27-d74c4cf88e1f/glance-httpd/0.log" Jan 30 09:36:25 crc kubenswrapper[4870]: I0130 09:36:25.923093 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2efb8d24-a358-43df-af27-d74c4cf88e1f/glance-log/0.log" Jan 30 09:36:26 crc kubenswrapper[4870]: I0130 09:36:26.257169 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-769d7654db-gw44c_b6c9337c-50ce-4c5c-a84f-8092d25fa1e2/horizon/0.log" Jan 30 09:36:26 crc kubenswrapper[4870]: I0130 09:36:26.284466 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8t9nx_51d5d5e3-867b-4ec9-9fca-07038b83ba29/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:26 crc kubenswrapper[4870]: I0130 09:36:26.523238 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-z4hkm_82fb960a-335c-4d35-baed-122cd1cb515d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:26 crc kubenswrapper[4870]: I0130 09:36:26.700451 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-769d7654db-gw44c_b6c9337c-50ce-4c5c-a84f-8092d25fa1e2/horizon-log/0.log" Jan 30 09:36:26 crc kubenswrapper[4870]: I0130 09:36:26.748958 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29496061-tjh7b_43a9af69-f9ef-444e-8505-ccf1eac1a036/keystone-cron/0.log" Jan 30 09:36:26 crc kubenswrapper[4870]: I0130 09:36:26.968959 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_0deb54ca-48c2-4b35-88c0-dbad5e8b9272/kube-state-metrics/0.log" Jan 30 09:36:27 crc kubenswrapper[4870]: I0130 09:36:27.025195 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55b585f57f-9h2lg_cb9f4cfa-0698-47dd-9319-47b185d2f937/keystone-api/0.log" Jan 30 09:36:27 crc kubenswrapper[4870]: I0130 09:36:27.129832 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-26lfr_9e214e41-a575-467c-a053-d6807c4f1512/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:27 crc kubenswrapper[4870]: I0130 09:36:27.791666 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d69bf9957-gj6dt_a50dec5c-d013-42b7-8a60-c405d5c93362/neutron-api/0.log" Jan 30 09:36:27 crc kubenswrapper[4870]: I0130 09:36:27.810090 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d69bf9957-gj6dt_a50dec5c-d013-42b7-8a60-c405d5c93362/neutron-httpd/0.log" Jan 30 09:36:27 crc kubenswrapper[4870]: I0130 09:36:27.845964 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5hlg8_bbcba502-7991-4f7b-bdbd-b112cec436b9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:28 crc kubenswrapper[4870]: I0130 09:36:28.080285 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:36:28 crc kubenswrapper[4870]: E0130 09:36:28.080521 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" 
podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:36:28 crc kubenswrapper[4870]: I0130 09:36:28.490163 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9834ddd4-269a-463c-953c-1bf07a7ffdf0/nova-cell0-conductor-conductor/0.log" Jan 30 09:36:28 crc kubenswrapper[4870]: I0130 09:36:28.901658 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e5686258-ed50-49a1-920b-77e9bbe01c55/nova-cell1-conductor-conductor/0.log" Jan 30 09:36:29 crc kubenswrapper[4870]: I0130 09:36:29.092957 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f6319a2a-594b-4da1-be42-ad0918221515/nova-cell1-novncproxy-novncproxy/0.log" Jan 30 09:36:29 crc kubenswrapper[4870]: I0130 09:36:29.266018 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ed40aa22-a330-46ab-9971-39e764e63ff7/nova-api-log/0.log" Jan 30 09:36:29 crc kubenswrapper[4870]: I0130 09:36:29.402375 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-4b7pb_da926ccc-5787-4741-a00c-1163494adb5e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:29 crc kubenswrapper[4870]: I0130 09:36:29.558057 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ccdea203-220a-457e-b00f-61b48afc7329/nova-metadata-log/0.log" Jan 30 09:36:29 crc kubenswrapper[4870]: I0130 09:36:29.641551 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ed40aa22-a330-46ab-9971-39e764e63ff7/nova-api-api/0.log" Jan 30 09:36:30 crc kubenswrapper[4870]: I0130 09:36:30.553276 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31607550-5ccc-4b0b-9fbd-18007a61dcff/mysql-bootstrap/0.log" Jan 30 09:36:30 crc kubenswrapper[4870]: I0130 09:36:30.556373 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_ecf0d9a5-3778-453d-ad3a-0d28eb2e71a6/nova-scheduler-scheduler/0.log" Jan 30 09:36:30 crc kubenswrapper[4870]: I0130 09:36:30.836143 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31607550-5ccc-4b0b-9fbd-18007a61dcff/galera/0.log" Jan 30 09:36:30 crc kubenswrapper[4870]: I0130 09:36:30.845795 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31607550-5ccc-4b0b-9fbd-18007a61dcff/mysql-bootstrap/0.log" Jan 30 09:36:31 crc kubenswrapper[4870]: I0130 09:36:31.049056 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a/mysql-bootstrap/0.log" Jan 30 09:36:31 crc kubenswrapper[4870]: I0130 09:36:31.276602 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a/galera/0.log" Jan 30 09:36:31 crc kubenswrapper[4870]: I0130 09:36:31.303698 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2ce45bb8-e721-40bb-a9fb-ac0d6b0deb4a/mysql-bootstrap/0.log" Jan 30 09:36:31 crc kubenswrapper[4870]: I0130 09:36:31.449844 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_204a0d39-f7b0-4468-a82f-9fcc49fc1281/openstackclient/0.log" Jan 30 09:36:31 crc kubenswrapper[4870]: I0130 09:36:31.540728 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-56vf8_eaa9048d-8c54-4054-87d1-69c6746c1479/openstack-network-exporter/0.log" Jan 30 09:36:32 crc kubenswrapper[4870]: I0130 09:36:32.395822 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ccdea203-220a-457e-b00f-61b48afc7329/nova-metadata-metadata/0.log" Jan 30 09:36:32 crc kubenswrapper[4870]: I0130 09:36:32.630749 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-gznh8_b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2/ovsdb-server-init/0.log" Jan 30 09:36:32 crc kubenswrapper[4870]: I0130 09:36:32.840637 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gznh8_b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2/ovsdb-server-init/0.log" Jan 30 09:36:32 crc kubenswrapper[4870]: I0130 09:36:32.873862 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gznh8_b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2/ovsdb-server/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.121262 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rwchz_496b707b-8de6-4228-b4fd-a48f3709586c/ovn-controller/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.266206 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gznh8_b3231fb5-8edc-4a0d-a0f2-bf9aeb8e4fe2/ovs-vswitchd/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.464400 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-8z72z_11f380d9-7c41-4b65-a46d-01c14ac81c07/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.525787 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d69aef12-ac48-41f7-8a14-a561edab0ae7/openstack-network-exporter/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.529407 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d69aef12-ac48-41f7-8a14-a561edab0ae7/ovn-northd/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.710279 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e9a5fd23-1240-4284-91cf-b57f4b2e3d02/openstack-network-exporter/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.860498 4870 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e9a5fd23-1240-4284-91cf-b57f4b2e3d02/ovsdbserver-nb/0.log" Jan 30 09:36:33 crc kubenswrapper[4870]: I0130 09:36:33.983140 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_625f2d84-6699-4e9f-881e-e96509760e9d/openstack-network-exporter/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.085177 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_625f2d84-6699-4e9f-881e-e96509760e9d/ovsdbserver-sb/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.406048 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56cfc8cc98-pfz9w_a0bafb1e-cef8-4a8c-bb78-a5d11d098691/placement-api/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.422687 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a4d5397-32f0-4cc0-919b-cf4ed004b797/init-config-reloader/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.463437 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56cfc8cc98-pfz9w_a0bafb1e-cef8-4a8c-bb78-a5d11d098691/placement-log/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.644795 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a4d5397-32f0-4cc0-919b-cf4ed004b797/config-reloader/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.717444 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a4d5397-32f0-4cc0-919b-cf4ed004b797/init-config-reloader/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.776613 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a4d5397-32f0-4cc0-919b-cf4ed004b797/prometheus/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.855080 4870 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a4d5397-32f0-4cc0-919b-cf4ed004b797/thanos-sidecar/0.log" Jan 30 09:36:34 crc kubenswrapper[4870]: I0130 09:36:34.969914 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2575ea2c-dc22-4ca2-bf0b-d67eaa330832/setup-container/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.252033 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2575ea2c-dc22-4ca2-bf0b-d67eaa330832/setup-container/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.291218 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_2ab884a9-b47a-476a-8f89-140093b96527/setup-container/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.368081 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_2575ea2c-dc22-4ca2-bf0b-d67eaa330832/rabbitmq/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.579093 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_2ab884a9-b47a-476a-8f89-140093b96527/setup-container/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.634521 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_2ab884a9-b47a-476a-8f89-140093b96527/rabbitmq/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.674030 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bf05f72e-aa42-4296-a7dc-8b742d6e0aab/setup-container/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.901640 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bf05f72e-aa42-4296-a7dc-8b742d6e0aab/setup-container/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.976845 4870 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bf05f72e-aa42-4296-a7dc-8b742d6e0aab/rabbitmq/0.log" Jan 30 09:36:35 crc kubenswrapper[4870]: I0130 09:36:35.985535 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2ns29_7c9e0c7d-dc65-4862-99da-326bc8d45bfd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:36 crc kubenswrapper[4870]: I0130 09:36:36.261750 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zdqzm_68089c9f-f566-4e65-b2ea-dd65a4d9012c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:36 crc kubenswrapper[4870]: I0130 09:36:36.269185 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fpd48_c22cad0f-b909-42fa-95c5-2536e1105161/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:36 crc kubenswrapper[4870]: I0130 09:36:36.811144 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zl6bw_a685318c-e23f-4192-8ab4-7dbf24880b0d/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:36 crc kubenswrapper[4870]: I0130 09:36:36.884185 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j8w7g_07db545c-df21-4f19-ad37-3071248b8672/ssh-known-hosts-edpm-deployment/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.189655 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-847c478677-wtndf_c01b58ab-bb54-448b-83de-f70f08378751/proxy-server/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.268824 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-847c478677-wtndf_c01b58ab-bb54-448b-83de-f70f08378751/proxy-httpd/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.411539 4870 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gkrl7_4406e732-41a8-48a1-954a-6dbe4483a79a/swift-ring-rebalance/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.508243 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/account-auditor/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.555908 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/account-reaper/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.734976 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/account-replicator/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.799180 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/container-auditor/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.816853 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/account-server/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.865237 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/container-replicator/0.log" Jan 30 09:36:37 crc kubenswrapper[4870]: I0130 09:36:37.996860 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/container-server/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.019488 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/container-updater/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.105407 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/object-auditor/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.113868 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/object-expirer/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.246620 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/object-replicator/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.283451 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/object-server/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.348927 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/rsync/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.352026 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/object-updater/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.554763 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_46634e41-7d5b-4181-b824-716bb37fca47/swift-recon-cron/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.688988 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pxbfz_1e93cbad-07e7-4073-a577-b666a6901a1d/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:38 crc kubenswrapper[4870]: I0130 09:36:38.995848 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ca368ef3-843d-4326-a899-9f4a1f6466c3/test-operator-logs-container/0.log" Jan 30 09:36:39 crc kubenswrapper[4870]: I0130 09:36:39.074747 4870 scope.go:117] 
"RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:36:39 crc kubenswrapper[4870]: E0130 09:36:39.075088 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:36:39 crc kubenswrapper[4870]: I0130 09:36:39.246386 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ntrfh_2f708fca-b1a9-432a-acbe-df74341208d2/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 09:36:39 crc kubenswrapper[4870]: I0130 09:36:39.252176 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_dc531a0b-3bc8-45c0-935d-6425c9ee5e3a/tempest-tests-tempest-tests-runner/0.log" Jan 30 09:36:39 crc kubenswrapper[4870]: I0130 09:36:39.985139 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_4061e0b3-e3ae-4ef0-a979-6028df77da5c/watcher-applier/0.log" Jan 30 09:36:40 crc kubenswrapper[4870]: I0130 09:36:40.681774 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_964cd6aa-bebd-412e-bd1c-001d151a90e8/watcher-api-log/0.log" Jan 30 09:36:43 crc kubenswrapper[4870]: I0130 09:36:43.865382 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_83b9fe73-9106-4f9b-9272-6f12e3fb8177/watcher-decision-engine/0.log" Jan 30 09:36:44 crc kubenswrapper[4870]: I0130 09:36:44.356368 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_964cd6aa-bebd-412e-bd1c-001d151a90e8/watcher-api/0.log" Jan 30 
09:36:46 crc kubenswrapper[4870]: I0130 09:36:46.166432 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d691b652-0077-4709-9e9d-16b87c8d3d3c/memcached/0.log" Jan 30 09:36:54 crc kubenswrapper[4870]: I0130 09:36:54.075045 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:36:54 crc kubenswrapper[4870]: E0130 09:36:54.076634 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:37:05 crc kubenswrapper[4870]: I0130 09:37:05.075181 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:37:05 crc kubenswrapper[4870]: E0130 09:37:05.076531 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:37:12 crc kubenswrapper[4870]: I0130 09:37:12.410916 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw_6f3c7406-d095-434b-a79a-f24373a9b141/util/0.log" Jan 30 09:37:12 crc kubenswrapper[4870]: I0130 09:37:12.594753 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw_6f3c7406-d095-434b-a79a-f24373a9b141/util/0.log" Jan 30 09:37:12 crc kubenswrapper[4870]: I0130 09:37:12.607646 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw_6f3c7406-d095-434b-a79a-f24373a9b141/pull/0.log" Jan 30 09:37:12 crc kubenswrapper[4870]: I0130 09:37:12.609870 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw_6f3c7406-d095-434b-a79a-f24373a9b141/pull/0.log" Jan 30 09:37:12 crc kubenswrapper[4870]: I0130 09:37:12.809248 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw_6f3c7406-d095-434b-a79a-f24373a9b141/util/0.log" Jan 30 09:37:12 crc kubenswrapper[4870]: I0130 09:37:12.810377 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw_6f3c7406-d095-434b-a79a-f24373a9b141/pull/0.log" Jan 30 09:37:12 crc kubenswrapper[4870]: I0130 09:37:12.830403 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_538103733790cb5f656298b9ceab00deaa6551dd449a0a275a081caf7atkgmw_6f3c7406-d095-434b-a79a-f24373a9b141/extract/0.log" Jan 30 09:37:13 crc kubenswrapper[4870]: I0130 09:37:13.100355 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-wfpg9_54c01287-d66d-46bc-bbb8-7532263099c5/manager/0.log" Jan 30 09:37:13 crc kubenswrapper[4870]: I0130 09:37:13.108392 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-hsfpq_e973c5f3-3291-4d4b-85ce-806ef6f83c1a/manager/0.log" Jan 30 09:37:13 crc kubenswrapper[4870]: 
I0130 09:37:13.313034 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-grbz8_dfd97388-6e7f-4f4a-9e71-7a32a28b1dcb/manager/0.log" Jan 30 09:37:13 crc kubenswrapper[4870]: I0130 09:37:13.401791 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-tkrpg_96be73fb-f1fc-4c5c-a643-7b9dcc832ac6/manager/0.log" Jan 30 09:37:13 crc kubenswrapper[4870]: I0130 09:37:13.566908 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-wlkxq_b9449ead-e087-4895-a88a-8bdfe0835ebd/manager/0.log" Jan 30 09:37:13 crc kubenswrapper[4870]: I0130 09:37:13.580225 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-hbmf7_925313c0-6800-4a27-814b-887b46cf49ad/manager/0.log" Jan 30 09:37:13 crc kubenswrapper[4870]: I0130 09:37:13.842389 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-5vfrj_5680ceb3-f5ec-4d9e-a313-13564402bff2/manager/0.log" Jan 30 09:37:14 crc kubenswrapper[4870]: I0130 09:37:14.105517 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-spzcf_46e6ee9b-0d57-4f76-8cf6-e8cea47b5f05/manager/0.log" Jan 30 09:37:14 crc kubenswrapper[4870]: I0130 09:37:14.143129 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-rhfst_db7aeba5-92f5-4887-9a6a-92d8c57650d2/manager/0.log" Jan 30 09:37:14 crc kubenswrapper[4870]: I0130 09:37:14.146478 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-j9bdn_5cde6cc5-f427-4349-8c8a-3dce0deac5a9/manager/0.log" Jan 30 09:37:14 crc 
kubenswrapper[4870]: I0130 09:37:14.367081 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-59rt2_ea3efedd-cb74-48c7-b246-b188bac37ed4/manager/0.log" Jan 30 09:37:14 crc kubenswrapper[4870]: I0130 09:37:14.427523 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-2xdfh_0ea209e2-96bf-4919-ad8f-f86de2b78ab1/manager/0.log" Jan 30 09:37:14 crc kubenswrapper[4870]: I0130 09:37:14.648748 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-cpn6f_604ff246-0f47-4c2c-8940-d76f10dce14e/manager/0.log" Jan 30 09:37:14 crc kubenswrapper[4870]: I0130 09:37:14.668037 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-4sftq_2ee622d2-acd4-4eec-9fbb-12b5bae7e32f/manager/0.log" Jan 30 09:37:14 crc kubenswrapper[4870]: I0130 09:37:14.832039 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4dwj9r8_be7a26e3-9284-4316-bce7-7bc15c9178bd/manager/0.log" Jan 30 09:37:15 crc kubenswrapper[4870]: I0130 09:37:15.065228 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-594f7f44c-vnpnd_b5c8b38a-bdec-4120-9802-5a35815eca01/operator/0.log" Jan 30 09:37:15 crc kubenswrapper[4870]: I0130 09:37:15.321258 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4bccf_c79c7300-5362-40dc-a952-2193e7a6908b/registry-server/0.log" Jan 30 09:37:15 crc kubenswrapper[4870]: I0130 09:37:15.577217 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-t4hbm_ec9257db-1c02-4160-9c89-7df62f2ce602/manager/0.log" Jan 30 09:37:15 
crc kubenswrapper[4870]: I0130 09:37:15.598484 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-mx5xp_274d3a56-3caf-4dd2-b122-e3b45a3eec6e/manager/0.log"
Jan 30 09:37:15 crc kubenswrapper[4870]: I0130 09:37:15.940213 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-sds6v_b706cc39-6af6-4a91-b2a2-6160148dadae/operator/0.log"
Jan 30 09:37:16 crc kubenswrapper[4870]: I0130 09:37:16.074337 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"
Jan 30 09:37:16 crc kubenswrapper[4870]: E0130 09:37:16.074783 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:37:16 crc kubenswrapper[4870]: I0130 09:37:16.109632 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-497sn_2de7363a-3627-42bb-a58f-7bad2e414192/manager/0.log"
Jan 30 09:37:16 crc kubenswrapper[4870]: I0130 09:37:16.349315 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-t8ncr_378c24d4-b8c1-4cd2-a85c-8449aa00ad3e/manager/0.log"
Jan 30 09:37:16 crc kubenswrapper[4870]: I0130 09:37:16.579608 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-bmzrd_0319ce7f-95ab-4abf-9101-bf436cc74bf4/manager/0.log"
Jan 30 09:37:16 crc kubenswrapper[4870]: I0130 09:37:16.679201 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7b7dd57594-2p68v_d6956410-92c0-40bf-b1c1-a3353ccf1bbc/manager/0.log"
Jan 30 09:37:16 crc kubenswrapper[4870]: I0130 09:37:16.743853 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65544cf747-sgxjd_fcdb20a3-7229-48e6-8f12-d1b6a5c892f3/manager/0.log"
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.817572 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gwxbl"]
Jan 30 09:37:26 crc kubenswrapper[4870]: E0130 09:37:26.829895 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384e367c-c2a4-4dbf-bb60-a903590c8ead" containerName="container-00"
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.829921 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="384e367c-c2a4-4dbf-bb60-a903590c8ead" containerName="container-00"
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.830267 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="384e367c-c2a4-4dbf-bb60-a903590c8ead" containerName="container-00"
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.832563 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.832658 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwxbl"]
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.925948 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-utilities\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.926025 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4ws\" (UniqueName: \"kubernetes.io/projected/b6135191-d11b-46b6-9eaf-08a0ffb73387-kube-api-access-xv4ws\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:26 crc kubenswrapper[4870]: I0130 09:37:26.926174 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-catalog-content\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.028671 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-utilities\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.028801 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4ws\" (UniqueName: \"kubernetes.io/projected/b6135191-d11b-46b6-9eaf-08a0ffb73387-kube-api-access-xv4ws\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.028937 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-catalog-content\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.029181 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-utilities\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.029474 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-catalog-content\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.066016 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4ws\" (UniqueName: \"kubernetes.io/projected/b6135191-d11b-46b6-9eaf-08a0ffb73387-kube-api-access-xv4ws\") pod \"redhat-operators-gwxbl\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") " pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.155505 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:27 crc kubenswrapper[4870]: W0130 09:37:27.785437 4870 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6135191_d11b_46b6_9eaf_08a0ffb73387.slice/crio-431c4158a94ca496e82d431c484f0b8025d2bfa0356402d185b46c4ba06c8cf3 WatchSource:0}: Error finding container 431c4158a94ca496e82d431c484f0b8025d2bfa0356402d185b46c4ba06c8cf3: Status 404 returned error can't find the container with id 431c4158a94ca496e82d431c484f0b8025d2bfa0356402d185b46c4ba06c8cf3
Jan 30 09:37:27 crc kubenswrapper[4870]: I0130 09:37:27.787997 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwxbl"]
Jan 30 09:37:28 crc kubenswrapper[4870]: I0130 09:37:28.644017 4870 generic.go:334] "Generic (PLEG): container finished" podID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerID="a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b" exitCode=0
Jan 30 09:37:28 crc kubenswrapper[4870]: I0130 09:37:28.644057 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwxbl" event={"ID":"b6135191-d11b-46b6-9eaf-08a0ffb73387","Type":"ContainerDied","Data":"a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b"}
Jan 30 09:37:28 crc kubenswrapper[4870]: I0130 09:37:28.644346 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwxbl" event={"ID":"b6135191-d11b-46b6-9eaf-08a0ffb73387","Type":"ContainerStarted","Data":"431c4158a94ca496e82d431c484f0b8025d2bfa0356402d185b46c4ba06c8cf3"}
Jan 30 09:37:29 crc kubenswrapper[4870]: I0130 09:37:29.657294 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwxbl" event={"ID":"b6135191-d11b-46b6-9eaf-08a0ffb73387","Type":"ContainerStarted","Data":"c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107"}
Jan 30 09:37:31 crc kubenswrapper[4870]: I0130 09:37:31.074975 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"
Jan 30 09:37:31 crc kubenswrapper[4870]: E0130 09:37:31.075671 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:37:37 crc kubenswrapper[4870]: I0130 09:37:37.473979 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vzzk7_6f6b1608-a0e3-4b68-ae9b-f8cfeed9cd72/control-plane-machine-set-operator/0.log"
Jan 30 09:37:37 crc kubenswrapper[4870]: I0130 09:37:37.474312 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jr94b_042ed63b-a1a9-4072-ae87-71b9fb98280c/kube-rbac-proxy/0.log"
Jan 30 09:37:37 crc kubenswrapper[4870]: I0130 09:37:37.569864 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jr94b_042ed63b-a1a9-4072-ae87-71b9fb98280c/machine-api-operator/0.log"
Jan 30 09:37:39 crc kubenswrapper[4870]: I0130 09:37:39.775040 4870 generic.go:334] "Generic (PLEG): container finished" podID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerID="c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107" exitCode=0
Jan 30 09:37:39 crc kubenswrapper[4870]: I0130 09:37:39.775129 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwxbl" event={"ID":"b6135191-d11b-46b6-9eaf-08a0ffb73387","Type":"ContainerDied","Data":"c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107"}
Jan 30 09:37:40 crc kubenswrapper[4870]: I0130 09:37:40.790692 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwxbl" event={"ID":"b6135191-d11b-46b6-9eaf-08a0ffb73387","Type":"ContainerStarted","Data":"6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb"}
Jan 30 09:37:40 crc kubenswrapper[4870]: I0130 09:37:40.828283 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gwxbl" podStartSLOduration=3.252498142 podStartE2EDuration="14.828251423s" podCreationTimestamp="2026-01-30 09:37:26 +0000 UTC" firstStartedPulling="2026-01-30 09:37:28.646384803 +0000 UTC m=+5287.341931922" lastFinishedPulling="2026-01-30 09:37:40.222138084 +0000 UTC m=+5298.917685203" observedRunningTime="2026-01-30 09:37:40.8147094 +0000 UTC m=+5299.510256499" watchObservedRunningTime="2026-01-30 09:37:40.828251423 +0000 UTC m=+5299.523798532"
Jan 30 09:37:45 crc kubenswrapper[4870]: I0130 09:37:45.076067 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"
Jan 30 09:37:45 crc kubenswrapper[4870]: E0130 09:37:45.077425 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:37:47 crc kubenswrapper[4870]: I0130 09:37:47.156402 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:47 crc kubenswrapper[4870]: I0130 09:37:47.158074 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:47 crc kubenswrapper[4870]: I0130 09:37:47.216230 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:47 crc kubenswrapper[4870]: I0130 09:37:47.922192 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:47 crc kubenswrapper[4870]: I0130 09:37:47.990133 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwxbl"]
Jan 30 09:37:49 crc kubenswrapper[4870]: I0130 09:37:49.883132 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gwxbl" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="registry-server" containerID="cri-o://6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb" gracePeriod=2
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.384627 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.561397 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-utilities\") pod \"b6135191-d11b-46b6-9eaf-08a0ffb73387\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") "
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.561690 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv4ws\" (UniqueName: \"kubernetes.io/projected/b6135191-d11b-46b6-9eaf-08a0ffb73387-kube-api-access-xv4ws\") pod \"b6135191-d11b-46b6-9eaf-08a0ffb73387\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") "
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.561834 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-catalog-content\") pod \"b6135191-d11b-46b6-9eaf-08a0ffb73387\" (UID: \"b6135191-d11b-46b6-9eaf-08a0ffb73387\") "
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.562371 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-utilities" (OuterVolumeSpecName: "utilities") pod "b6135191-d11b-46b6-9eaf-08a0ffb73387" (UID: "b6135191-d11b-46b6-9eaf-08a0ffb73387"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.573420 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6135191-d11b-46b6-9eaf-08a0ffb73387-kube-api-access-xv4ws" (OuterVolumeSpecName: "kube-api-access-xv4ws") pod "b6135191-d11b-46b6-9eaf-08a0ffb73387" (UID: "b6135191-d11b-46b6-9eaf-08a0ffb73387"). InnerVolumeSpecName "kube-api-access-xv4ws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.664185 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.664233 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv4ws\" (UniqueName: \"kubernetes.io/projected/b6135191-d11b-46b6-9eaf-08a0ffb73387-kube-api-access-xv4ws\") on node \"crc\" DevicePath \"\""
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.702950 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6135191-d11b-46b6-9eaf-08a0ffb73387" (UID: "b6135191-d11b-46b6-9eaf-08a0ffb73387"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.765736 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6135191-d11b-46b6-9eaf-08a0ffb73387-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.894271 4870 generic.go:334] "Generic (PLEG): container finished" podID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerID="6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb" exitCode=0
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.894321 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwxbl" event={"ID":"b6135191-d11b-46b6-9eaf-08a0ffb73387","Type":"ContainerDied","Data":"6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb"}
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.894337 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwxbl"
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.894353 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwxbl" event={"ID":"b6135191-d11b-46b6-9eaf-08a0ffb73387","Type":"ContainerDied","Data":"431c4158a94ca496e82d431c484f0b8025d2bfa0356402d185b46c4ba06c8cf3"}
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.894376 4870 scope.go:117] "RemoveContainer" containerID="6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb"
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.916837 4870 scope.go:117] "RemoveContainer" containerID="c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107"
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.968084 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwxbl"]
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.970112 4870 scope.go:117] "RemoveContainer" containerID="a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b"
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.973690 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gwxbl"]
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.999187 4870 scope.go:117] "RemoveContainer" containerID="6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb"
Jan 30 09:37:50 crc kubenswrapper[4870]: E0130 09:37:50.999745 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb\": container with ID starting with 6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb not found: ID does not exist" containerID="6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb"
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.999791 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb"} err="failed to get container status \"6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb\": rpc error: code = NotFound desc = could not find container \"6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb\": container with ID starting with 6339dc6602b8301690d5998649fb74d13d5b28888836b982f9059db82a0eeffb not found: ID does not exist"
Jan 30 09:37:50 crc kubenswrapper[4870]: I0130 09:37:50.999835 4870 scope.go:117] "RemoveContainer" containerID="c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107"
Jan 30 09:37:51 crc kubenswrapper[4870]: E0130 09:37:51.000447 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107\": container with ID starting with c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107 not found: ID does not exist" containerID="c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.000502 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107"} err="failed to get container status \"c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107\": rpc error: code = NotFound desc = could not find container \"c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107\": container with ID starting with c07b12bf71463e8c5f4fbf6d8677bba531fc8125d7076b8e0b538798477c2107 not found: ID does not exist"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.000539 4870 scope.go:117] "RemoveContainer" containerID="a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b"
Jan 30 09:37:51 crc kubenswrapper[4870]: E0130 09:37:51.001003 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b\": container with ID starting with a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b not found: ID does not exist" containerID="a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.001090 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b"} err="failed to get container status \"a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b\": rpc error: code = NotFound desc = could not find container \"a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b\": container with ID starting with a5c124c31129055a8afcb9a374ab7820a1cd94d86d3fd14a6e54ef3129cc493b not found: ID does not exist"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.082725 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-84sjp"]
Jan 30 09:37:51 crc kubenswrapper[4870]: E0130 09:37:51.083304 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="extract-content"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.083330 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="extract-content"
Jan 30 09:37:51 crc kubenswrapper[4870]: E0130 09:37:51.083356 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="extract-utilities"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.083365 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="extract-utilities"
Jan 30 09:37:51 crc kubenswrapper[4870]: E0130 09:37:51.083410 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="registry-server"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.083427 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="registry-server"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.083668 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" containerName="registry-server"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.085565 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.116159 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84sjp"]
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.275904 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f79kd\" (UniqueName: \"kubernetes.io/projected/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-kube-api-access-f79kd\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.276292 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-utilities\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.276430 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-catalog-content\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.379215 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f79kd\" (UniqueName: \"kubernetes.io/projected/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-kube-api-access-f79kd\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.379334 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-utilities\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.379369 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-catalog-content\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.379908 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-catalog-content\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.380277 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-utilities\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.401908 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f79kd\" (UniqueName: \"kubernetes.io/projected/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-kube-api-access-f79kd\") pod \"certified-operators-84sjp\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") " pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:51 crc kubenswrapper[4870]: I0130 09:37:51.440459 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.049003 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84sjp"]
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.089467 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6135191-d11b-46b6-9eaf-08a0ffb73387" path="/var/lib/kubelet/pods/b6135191-d11b-46b6-9eaf-08a0ffb73387/volumes"
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.755006 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ltz5g_dfee5a53-cd5a-470f-9327-e614ff6e56b3/cert-manager-controller/0.log"
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.857370 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-2hzbl_4e91c0f0-40df-495c-8758-892355565838/cert-manager-cainjector/0.log"
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.914557 4870 generic.go:334] "Generic (PLEG): container finished" podID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerID="3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e" exitCode=0
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.914621 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84sjp" event={"ID":"9048c280-ecbd-4fcb-ac6f-4b095c6e3748","Type":"ContainerDied","Data":"3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e"}
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.914651 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84sjp" event={"ID":"9048c280-ecbd-4fcb-ac6f-4b095c6e3748","Type":"ContainerStarted","Data":"317d6101c3669f63b3e2482610f5f0422019c2ced984517b4c9c94f58e751f84"}
Jan 30 09:37:52 crc kubenswrapper[4870]: I0130 09:37:52.976335 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-n5xzk_c3c8ba60-0b0f-4f22-9e7e-99b0dbc05ec1/cert-manager-webhook/0.log"
Jan 30 09:37:54 crc kubenswrapper[4870]: I0130 09:37:54.936059 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84sjp" event={"ID":"9048c280-ecbd-4fcb-ac6f-4b095c6e3748","Type":"ContainerStarted","Data":"9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1"}
Jan 30 09:37:55 crc kubenswrapper[4870]: I0130 09:37:55.975032 4870 generic.go:334] "Generic (PLEG): container finished" podID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerID="9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1" exitCode=0
Jan 30 09:37:55 crc kubenswrapper[4870]: I0130 09:37:55.975507 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84sjp" event={"ID":"9048c280-ecbd-4fcb-ac6f-4b095c6e3748","Type":"ContainerDied","Data":"9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1"}
Jan 30 09:37:56 crc kubenswrapper[4870]: I0130 09:37:56.074869 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40"
Jan 30 09:37:56 crc kubenswrapper[4870]: I0130 09:37:56.987439 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"47da0905bfb9191daf20cd74642a93fa3691ef9a29f1507f03d471be1698c277"}
Jan 30 09:37:56 crc kubenswrapper[4870]: I0130 09:37:56.990101 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84sjp" event={"ID":"9048c280-ecbd-4fcb-ac6f-4b095c6e3748","Type":"ContainerStarted","Data":"20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0"}
Jan 30 09:38:01 crc kubenswrapper[4870]: I0130 09:38:01.440698 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:38:01 crc kubenswrapper[4870]: I0130 09:38:01.441353 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:38:01 crc kubenswrapper[4870]: I0130 09:38:01.489831 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:38:01 crc kubenswrapper[4870]: I0130 09:38:01.516295 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-84sjp" podStartSLOduration=6.636568659 podStartE2EDuration="10.516273114s" podCreationTimestamp="2026-01-30 09:37:51 +0000 UTC" firstStartedPulling="2026-01-30 09:37:52.916541973 +0000 UTC m=+5311.612089072" lastFinishedPulling="2026-01-30 09:37:56.796246408 +0000 UTC m=+5315.491793527" observedRunningTime="2026-01-30 09:37:57.028435504 +0000 UTC m=+5315.723982623" watchObservedRunningTime="2026-01-30 09:38:01.516273114 +0000 UTC m=+5320.211820223"
Jan 30 09:38:02 crc kubenswrapper[4870]: I0130 09:38:02.095329 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:38:02 crc kubenswrapper[4870]: I0130 09:38:02.167664 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84sjp"]
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.053341 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-84sjp" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="registry-server" containerID="cri-o://20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0" gracePeriod=2
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.598311 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84sjp"
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.698751 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f79kd\" (UniqueName: \"kubernetes.io/projected/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-kube-api-access-f79kd\") pod \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") "
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.698919 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-utilities\") pod \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") "
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.699035 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-catalog-content\") pod \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\" (UID: \"9048c280-ecbd-4fcb-ac6f-4b095c6e3748\") "
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.701180 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-utilities" (OuterVolumeSpecName: "utilities") pod "9048c280-ecbd-4fcb-ac6f-4b095c6e3748" (UID: "9048c280-ecbd-4fcb-ac6f-4b095c6e3748"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.707028 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-kube-api-access-f79kd" (OuterVolumeSpecName: "kube-api-access-f79kd") pod "9048c280-ecbd-4fcb-ac6f-4b095c6e3748" (UID: "9048c280-ecbd-4fcb-ac6f-4b095c6e3748"). InnerVolumeSpecName "kube-api-access-f79kd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.768769 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9048c280-ecbd-4fcb-ac6f-4b095c6e3748" (UID: "9048c280-ecbd-4fcb-ac6f-4b095c6e3748"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.802633 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f79kd\" (UniqueName: \"kubernetes.io/projected/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-kube-api-access-f79kd\") on node \"crc\" DevicePath \"\""
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.802683 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:38:04 crc kubenswrapper[4870]: I0130 09:38:04.802699 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9048c280-ecbd-4fcb-ac6f-4b095c6e3748-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.066848 4870 generic.go:334] "Generic (PLEG): container finished" podID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerID="20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0" exitCode=0
Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.066922 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84sjp" event={"ID":"9048c280-ecbd-4fcb-ac6f-4b095c6e3748","Type":"ContainerDied","Data":"20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0"}
Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.067173 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84sjp" event={"ID":"9048c280-ecbd-4fcb-ac6f-4b095c6e3748","Type":"ContainerDied","Data":"317d6101c3669f63b3e2482610f5f0422019c2ced984517b4c9c94f58e751f84"}
Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.067194 4870 scope.go:117] "RemoveContainer" containerID="20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0"
Jan 30 09:38:05 crc kubenswrapper[4870]: I0130
09:38:05.066990 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84sjp" Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.100795 4870 scope.go:117] "RemoveContainer" containerID="9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1" Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.117093 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84sjp"] Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.128339 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-84sjp"] Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.611544 4870 scope.go:117] "RemoveContainer" containerID="3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e" Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.661386 4870 scope.go:117] "RemoveContainer" containerID="20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0" Jan 30 09:38:05 crc kubenswrapper[4870]: E0130 09:38:05.662149 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0\": container with ID starting with 20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0 not found: ID does not exist" containerID="20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0" Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.662177 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0"} err="failed to get container status \"20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0\": rpc error: code = NotFound desc = could not find container \"20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0\": container with ID starting with 
20b69832fce8a3445fd0e656e0d2e965ca971155faff5aefe0571a8fb6d9d7a0 not found: ID does not exist" Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.662197 4870 scope.go:117] "RemoveContainer" containerID="9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1" Jan 30 09:38:05 crc kubenswrapper[4870]: E0130 09:38:05.662568 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1\": container with ID starting with 9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1 not found: ID does not exist" containerID="9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1" Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.662584 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1"} err="failed to get container status \"9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1\": rpc error: code = NotFound desc = could not find container \"9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1\": container with ID starting with 9c563a1d2b675ea56ec79a4523094910d9938a668b434ecd3d19af4111aaa9f1 not found: ID does not exist" Jan 30 09:38:05 crc kubenswrapper[4870]: I0130 09:38:05.662597 4870 scope.go:117] "RemoveContainer" containerID="3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e" Jan 30 09:38:05 crc kubenswrapper[4870]: E0130 09:38:05.662884 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e\": container with ID starting with 3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e not found: ID does not exist" containerID="3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e" Jan 30 09:38:05 crc 
kubenswrapper[4870]: I0130 09:38:05.662981 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e"} err="failed to get container status \"3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e\": rpc error: code = NotFound desc = could not find container \"3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e\": container with ID starting with 3054befa2dad8228757064042af15055c5b09f7a8beefee27a231b2ae82c7a1e not found: ID does not exist" Jan 30 09:38:06 crc kubenswrapper[4870]: I0130 09:38:06.086430 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" path="/var/lib/kubelet/pods/9048c280-ecbd-4fcb-ac6f-4b095c6e3748/volumes" Jan 30 09:38:07 crc kubenswrapper[4870]: I0130 09:38:07.073055 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-ql9j9_b7e9a284-8b5c-4ae7-b388-3e9f907082d2/nmstate-console-plugin/0.log" Jan 30 09:38:07 crc kubenswrapper[4870]: I0130 09:38:07.235681 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tnl9h_f38692e7-8fd1-48e1-ab3b-07cbac975021/nmstate-handler/0.log" Jan 30 09:38:07 crc kubenswrapper[4870]: I0130 09:38:07.439390 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xdc74_86d16b9b-390e-442a-a74f-a9e32e92da59/nmstate-metrics/0.log" Jan 30 09:38:07 crc kubenswrapper[4870]: I0130 09:38:07.440140 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xdc74_86d16b9b-390e-442a-a74f-a9e32e92da59/kube-rbac-proxy/0.log" Jan 30 09:38:08 crc kubenswrapper[4870]: I0130 09:38:08.004743 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-sf8qk_bdb3e88d-691c-478c-ab03-cc84b8e04ea6/nmstate-operator/0.log" Jan 30 09:38:08 crc kubenswrapper[4870]: I0130 09:38:08.074458 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-rsk45_06799197-023a-4ed3-a378-9a1fbf25fda2/nmstate-webhook/0.log" Jan 30 09:38:21 crc kubenswrapper[4870]: I0130 09:38:21.157711 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hj2pn_614f63fc-ed66-41bb-b9fe-4229b3b67f50/prometheus-operator/0.log" Jan 30 09:38:21 crc kubenswrapper[4870]: I0130 09:38:21.280995 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf_1b8b459d-7a00-4e96-8916-4edd9fc87b99/prometheus-operator-admission-webhook/0.log" Jan 30 09:38:21 crc kubenswrapper[4870]: I0130 09:38:21.327975 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bf5558b74-9clj8_586011b7-bc23-4a41-8795-bc28910cd170/prometheus-operator-admission-webhook/0.log" Jan 30 09:38:21 crc kubenswrapper[4870]: I0130 09:38:21.504134 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tmzq2_962cb597-f461-4983-b37a-a4c9e545f7d8/perses-operator/0.log" Jan 30 09:38:21 crc kubenswrapper[4870]: I0130 09:38:21.511945 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lv4dk_0f7d84eb-b450-4168-b207-22520fed3fd3/operator/0.log" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.645432 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gvzmr"] Jan 30 09:38:26 crc kubenswrapper[4870]: E0130 09:38:26.646648 4870 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="extract-content" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.646667 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="extract-content" Jan 30 09:38:26 crc kubenswrapper[4870]: E0130 09:38:26.646690 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="extract-utilities" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.646699 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="extract-utilities" Jan 30 09:38:26 crc kubenswrapper[4870]: E0130 09:38:26.647129 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="registry-server" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.647143 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="registry-server" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.647417 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="9048c280-ecbd-4fcb-ac6f-4b095c6e3748" containerName="registry-server" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.649298 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.655263 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gvzmr"] Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.768705 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-catalog-content\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.768813 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzkvq\" (UniqueName: \"kubernetes.io/projected/2fe021a5-6534-4aad-aabc-da82e18587d6-kube-api-access-hzkvq\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.768832 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-utilities\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.871990 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-catalog-content\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.872174 4870 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hzkvq\" (UniqueName: \"kubernetes.io/projected/2fe021a5-6534-4aad-aabc-da82e18587d6-kube-api-access-hzkvq\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.872243 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-utilities\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.872719 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-catalog-content\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.873017 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-utilities\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.905391 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzkvq\" (UniqueName: \"kubernetes.io/projected/2fe021a5-6534-4aad-aabc-da82e18587d6-kube-api-access-hzkvq\") pod \"community-operators-gvzmr\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:26 crc kubenswrapper[4870]: I0130 09:38:26.966935 4870 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:27 crc kubenswrapper[4870]: I0130 09:38:27.555431 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gvzmr"] Jan 30 09:38:28 crc kubenswrapper[4870]: I0130 09:38:28.289639 4870 generic.go:334] "Generic (PLEG): container finished" podID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerID="d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6" exitCode=0 Jan 30 09:38:28 crc kubenswrapper[4870]: I0130 09:38:28.289752 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvzmr" event={"ID":"2fe021a5-6534-4aad-aabc-da82e18587d6","Type":"ContainerDied","Data":"d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6"} Jan 30 09:38:28 crc kubenswrapper[4870]: I0130 09:38:28.290733 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvzmr" event={"ID":"2fe021a5-6534-4aad-aabc-da82e18587d6","Type":"ContainerStarted","Data":"b44c001080442c74c291f4234f373f7dcd75b3d230046d851e0075a1d404593a"} Jan 30 09:38:28 crc kubenswrapper[4870]: I0130 09:38:28.291754 4870 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 09:38:30 crc kubenswrapper[4870]: I0130 09:38:30.309520 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvzmr" event={"ID":"2fe021a5-6534-4aad-aabc-da82e18587d6","Type":"ContainerStarted","Data":"25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a"} Jan 30 09:38:31 crc kubenswrapper[4870]: I0130 09:38:31.320577 4870 generic.go:334] "Generic (PLEG): container finished" podID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerID="25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a" exitCode=0 Jan 30 09:38:31 crc kubenswrapper[4870]: I0130 09:38:31.320745 4870 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-gvzmr" event={"ID":"2fe021a5-6534-4aad-aabc-da82e18587d6","Type":"ContainerDied","Data":"25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a"} Jan 30 09:38:32 crc kubenswrapper[4870]: I0130 09:38:32.333187 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvzmr" event={"ID":"2fe021a5-6534-4aad-aabc-da82e18587d6","Type":"ContainerStarted","Data":"7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747"} Jan 30 09:38:32 crc kubenswrapper[4870]: I0130 09:38:32.356199 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gvzmr" podStartSLOduration=2.83506848 podStartE2EDuration="6.356178268s" podCreationTimestamp="2026-01-30 09:38:26 +0000 UTC" firstStartedPulling="2026-01-30 09:38:28.291485252 +0000 UTC m=+5346.987032361" lastFinishedPulling="2026-01-30 09:38:31.81259504 +0000 UTC m=+5350.508142149" observedRunningTime="2026-01-30 09:38:32.351161931 +0000 UTC m=+5351.046709040" watchObservedRunningTime="2026-01-30 09:38:32.356178268 +0000 UTC m=+5351.051725377" Jan 30 09:38:36 crc kubenswrapper[4870]: I0130 09:38:36.795893 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2dwrk_b8c43bdb-2bfa-445b-9526-a03eb3f3ca20/kube-rbac-proxy/0.log" Jan 30 09:38:36 crc kubenswrapper[4870]: I0130 09:38:36.867278 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2dwrk_b8c43bdb-2bfa-445b-9526-a03eb3f3ca20/controller/0.log" Jan 30 09:38:36 crc kubenswrapper[4870]: I0130 09:38:36.967682 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:36 crc kubenswrapper[4870]: I0130 09:38:36.967740 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.004493 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-pnddd_5d3d6557-5b19-47c3-9e81-09b8dee3b239/frr-k8s-webhook-server/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.023508 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.094774 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-frr-files/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.267909 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-reloader/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.274330 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-frr-files/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.307910 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-reloader/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.329666 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-metrics/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.437550 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.492368 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gvzmr"] Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.509095 4870 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-reloader/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.553480 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-metrics/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.608978 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-frr-files/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.617213 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-metrics/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.806536 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-metrics/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.812232 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/controller/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.836488 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-frr-files/0.log" Jan 30 09:38:37 crc kubenswrapper[4870]: I0130 09:38:37.839445 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/cp-reloader/0.log" Jan 30 09:38:38 crc kubenswrapper[4870]: I0130 09:38:38.008095 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/frr-metrics/0.log" Jan 30 09:38:38 crc kubenswrapper[4870]: I0130 09:38:38.038567 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/kube-rbac-proxy/0.log" Jan 30 09:38:38 crc kubenswrapper[4870]: I0130 09:38:38.045227 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/kube-rbac-proxy-frr/0.log" Jan 30 09:38:38 crc kubenswrapper[4870]: I0130 09:38:38.226723 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/reloader/0.log" Jan 30 09:38:38 crc kubenswrapper[4870]: I0130 09:38:38.317806 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-567987c4fc-ff527_70a9e498-4f2a-40ff-8837-7811ffe26e2d/manager/0.log" Jan 30 09:38:38 crc kubenswrapper[4870]: I0130 09:38:38.567939 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d5db5fbbd-k8pwt_f01bc9ba-9427-4c0a-927e-56b20aca72c5/webhook-server/0.log" Jan 30 09:38:38 crc kubenswrapper[4870]: I0130 09:38:38.803843 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7q5pn_84099c66-a13e-4949-ae36-7fa85a6a6a56/kube-rbac-proxy/0.log" Jan 30 09:38:39 crc kubenswrapper[4870]: I0130 09:38:39.316622 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7q5pn_84099c66-a13e-4949-ae36-7fa85a6a6a56/speaker/0.log" Jan 30 09:38:39 crc kubenswrapper[4870]: I0130 09:38:39.397691 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gvzmr" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerName="registry-server" containerID="cri-o://7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747" gracePeriod=2 Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.000677 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.022405 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zwhkv_008f589d-dab4-42af-9a42-cb6c00737f44/frr/0.log" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.072183 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzkvq\" (UniqueName: \"kubernetes.io/projected/2fe021a5-6534-4aad-aabc-da82e18587d6-kube-api-access-hzkvq\") pod \"2fe021a5-6534-4aad-aabc-da82e18587d6\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.072230 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-catalog-content\") pod \"2fe021a5-6534-4aad-aabc-da82e18587d6\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.072363 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-utilities\") pod \"2fe021a5-6534-4aad-aabc-da82e18587d6\" (UID: \"2fe021a5-6534-4aad-aabc-da82e18587d6\") " Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.074641 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-utilities" (OuterVolumeSpecName: "utilities") pod "2fe021a5-6534-4aad-aabc-da82e18587d6" (UID: "2fe021a5-6534-4aad-aabc-da82e18587d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.089968 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe021a5-6534-4aad-aabc-da82e18587d6-kube-api-access-hzkvq" (OuterVolumeSpecName: "kube-api-access-hzkvq") pod "2fe021a5-6534-4aad-aabc-da82e18587d6" (UID: "2fe021a5-6534-4aad-aabc-da82e18587d6"). InnerVolumeSpecName "kube-api-access-hzkvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.145995 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fe021a5-6534-4aad-aabc-da82e18587d6" (UID: "2fe021a5-6534-4aad-aabc-da82e18587d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.175281 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzkvq\" (UniqueName: \"kubernetes.io/projected/2fe021a5-6534-4aad-aabc-da82e18587d6-kube-api-access-hzkvq\") on node \"crc\" DevicePath \"\"" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.175327 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.175342 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe021a5-6534-4aad-aabc-da82e18587d6-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.408605 4870 generic.go:334] "Generic (PLEG): container finished" podID="2fe021a5-6534-4aad-aabc-da82e18587d6" 
containerID="7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747" exitCode=0 Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.408678 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvzmr" event={"ID":"2fe021a5-6534-4aad-aabc-da82e18587d6","Type":"ContainerDied","Data":"7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747"} Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.409085 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gvzmr" event={"ID":"2fe021a5-6534-4aad-aabc-da82e18587d6","Type":"ContainerDied","Data":"b44c001080442c74c291f4234f373f7dcd75b3d230046d851e0075a1d404593a"} Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.409116 4870 scope.go:117] "RemoveContainer" containerID="7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.408711 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gvzmr" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.431570 4870 scope.go:117] "RemoveContainer" containerID="25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.451393 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gvzmr"] Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.465851 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gvzmr"] Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.470008 4870 scope.go:117] "RemoveContainer" containerID="d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.543257 4870 scope.go:117] "RemoveContainer" containerID="7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747" Jan 30 09:38:40 crc kubenswrapper[4870]: E0130 09:38:40.543747 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747\": container with ID starting with 7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747 not found: ID does not exist" containerID="7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.543791 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747"} err="failed to get container status \"7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747\": rpc error: code = NotFound desc = could not find container \"7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747\": container with ID starting with 7a92b1340513f77963339b6db5fd2c460124ab2a79c0e1a016074e513a325747 not 
found: ID does not exist" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.543818 4870 scope.go:117] "RemoveContainer" containerID="25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a" Jan 30 09:38:40 crc kubenswrapper[4870]: E0130 09:38:40.545324 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a\": container with ID starting with 25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a not found: ID does not exist" containerID="25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.545359 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a"} err="failed to get container status \"25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a\": rpc error: code = NotFound desc = could not find container \"25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a\": container with ID starting with 25bbe5473f8721e2d80baefa6abb9b0a74613b494e6b78293367482ab21a0a2a not found: ID does not exist" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.545378 4870 scope.go:117] "RemoveContainer" containerID="d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6" Jan 30 09:38:40 crc kubenswrapper[4870]: E0130 09:38:40.545719 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6\": container with ID starting with d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6 not found: ID does not exist" containerID="d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6" Jan 30 09:38:40 crc kubenswrapper[4870]: I0130 09:38:40.545777 4870 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6"} err="failed to get container status \"d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6\": rpc error: code = NotFound desc = could not find container \"d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6\": container with ID starting with d4c68d2b61af4455d80436ab2cd4e5bb21e57b71e94a4b798fd7f012135b08c6 not found: ID does not exist" Jan 30 09:38:42 crc kubenswrapper[4870]: I0130 09:38:42.092780 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" path="/var/lib/kubelet/pods/2fe021a5-6534-4aad-aabc-da82e18587d6/volumes" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.119868 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp_f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef/util/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.396895 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp_f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef/util/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.400489 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp_f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef/pull/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.400578 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp_f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef/pull/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.586542 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp_f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef/extract/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.624300 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp_f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef/pull/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.641323 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc5k6pp_f4f12d31-27df-4d6e-a1cb-64ed2b79d9ef/util/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.785621 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m_e702b53f-5799-4595-b78f-35717f81379f/util/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.953186 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m_e702b53f-5799-4595-b78f-35717f81379f/pull/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.962256 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m_e702b53f-5799-4595-b78f-35717f81379f/pull/0.log" Jan 30 09:38:53 crc kubenswrapper[4870]: I0130 09:38:53.984373 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m_e702b53f-5799-4595-b78f-35717f81379f/util/0.log" Jan 30 09:38:54 crc kubenswrapper[4870]: I0130 09:38:54.136355 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m_e702b53f-5799-4595-b78f-35717f81379f/util/0.log" Jan 30 
09:38:54 crc kubenswrapper[4870]: I0130 09:38:54.146602 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m_e702b53f-5799-4595-b78f-35717f81379f/pull/0.log" Jan 30 09:38:54 crc kubenswrapper[4870]: I0130 09:38:54.179229 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713s5s5m_e702b53f-5799-4595-b78f-35717f81379f/extract/0.log" Jan 30 09:38:54 crc kubenswrapper[4870]: I0130 09:38:54.655946 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m_69895f16-2797-4fd7-aedf-54fc47cd2c4f/util/0.log" Jan 30 09:38:54 crc kubenswrapper[4870]: I0130 09:38:54.866527 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m_69895f16-2797-4fd7-aedf-54fc47cd2c4f/util/0.log" Jan 30 09:38:54 crc kubenswrapper[4870]: I0130 09:38:54.910846 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m_69895f16-2797-4fd7-aedf-54fc47cd2c4f/pull/0.log" Jan 30 09:38:54 crc kubenswrapper[4870]: I0130 09:38:54.916652 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m_69895f16-2797-4fd7-aedf-54fc47cd2c4f/pull/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.085332 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m_69895f16-2797-4fd7-aedf-54fc47cd2c4f/pull/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.107015 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m_69895f16-2797-4fd7-aedf-54fc47cd2c4f/util/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.133445 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkw7m_69895f16-2797-4fd7-aedf-54fc47cd2c4f/extract/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.272202 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-whfhw_ea80eb92-6881-4e69-8ca2-050d32254eb7/extract-utilities/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.456407 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-whfhw_ea80eb92-6881-4e69-8ca2-050d32254eb7/extract-content/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.458086 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-whfhw_ea80eb92-6881-4e69-8ca2-050d32254eb7/extract-utilities/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.486701 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-whfhw_ea80eb92-6881-4e69-8ca2-050d32254eb7/extract-content/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.611362 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-whfhw_ea80eb92-6881-4e69-8ca2-050d32254eb7/extract-utilities/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.678115 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-whfhw_ea80eb92-6881-4e69-8ca2-050d32254eb7/extract-content/0.log" Jan 30 09:38:55 crc kubenswrapper[4870]: I0130 09:38:55.831803 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mqxgq_d7b3d065-5057-49c1-be84-7880d7d4d619/extract-utilities/0.log" Jan 30 09:38:56 crc kubenswrapper[4870]: I0130 09:38:56.022825 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mqxgq_d7b3d065-5057-49c1-be84-7880d7d4d619/extract-utilities/0.log" Jan 30 09:38:56 crc kubenswrapper[4870]: I0130 09:38:56.079951 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mqxgq_d7b3d065-5057-49c1-be84-7880d7d4d619/extract-content/0.log" Jan 30 09:38:56 crc kubenswrapper[4870]: I0130 09:38:56.094657 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mqxgq_d7b3d065-5057-49c1-be84-7880d7d4d619/extract-content/0.log" Jan 30 09:38:56 crc kubenswrapper[4870]: I0130 09:38:56.284514 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-whfhw_ea80eb92-6881-4e69-8ca2-050d32254eb7/registry-server/0.log" Jan 30 09:38:56 crc kubenswrapper[4870]: I0130 09:38:56.756325 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mqxgq_d7b3d065-5057-49c1-be84-7880d7d4d619/extract-utilities/0.log" Jan 30 09:38:56 crc kubenswrapper[4870]: I0130 09:38:56.792099 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mqxgq_d7b3d065-5057-49c1-be84-7880d7d4d619/extract-content/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.097766 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-glxrr_b1839882-74e1-4c94-9d83-849d10c41089/extract-utilities/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.112358 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-vkhzd_83d46dd9-5ab7-44c9-b032-1241911b6d82/marketplace-operator/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.386314 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-glxrr_b1839882-74e1-4c94-9d83-849d10c41089/extract-utilities/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.396971 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-glxrr_b1839882-74e1-4c94-9d83-849d10c41089/extract-content/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.406490 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-glxrr_b1839882-74e1-4c94-9d83-849d10c41089/extract-content/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.616784 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-glxrr_b1839882-74e1-4c94-9d83-849d10c41089/extract-utilities/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.758752 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-glxrr_b1839882-74e1-4c94-9d83-849d10c41089/extract-content/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.837335 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mqxgq_d7b3d065-5057-49c1-be84-7880d7d4d619/registry-server/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.886992 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dmqx_71b77216-d7c7-4a69-8596-e64fd99129c6/extract-utilities/0.log" Jan 30 09:38:57 crc kubenswrapper[4870]: I0130 09:38:57.904021 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-glxrr_b1839882-74e1-4c94-9d83-849d10c41089/registry-server/0.log" Jan 30 09:38:58 crc kubenswrapper[4870]: I0130 09:38:58.030133 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dmqx_71b77216-d7c7-4a69-8596-e64fd99129c6/extract-utilities/0.log" Jan 30 09:38:58 crc kubenswrapper[4870]: I0130 09:38:58.030390 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dmqx_71b77216-d7c7-4a69-8596-e64fd99129c6/extract-content/0.log" Jan 30 09:38:58 crc kubenswrapper[4870]: I0130 09:38:58.078186 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dmqx_71b77216-d7c7-4a69-8596-e64fd99129c6/extract-content/0.log" Jan 30 09:38:58 crc kubenswrapper[4870]: I0130 09:38:58.216820 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dmqx_71b77216-d7c7-4a69-8596-e64fd99129c6/extract-utilities/0.log" Jan 30 09:38:58 crc kubenswrapper[4870]: I0130 09:38:58.252327 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dmqx_71b77216-d7c7-4a69-8596-e64fd99129c6/extract-content/0.log" Jan 30 09:38:59 crc kubenswrapper[4870]: I0130 09:38:59.074893 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dmqx_71b77216-d7c7-4a69-8596-e64fd99129c6/registry-server/0.log" Jan 30 09:39:11 crc kubenswrapper[4870]: I0130 09:39:11.465619 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bf5558b74-9clj8_586011b7-bc23-4a41-8795-bc28910cd170/prometheus-operator-admission-webhook/0.log" Jan 30 09:39:11 crc kubenswrapper[4870]: I0130 09:39:11.468045 4870 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bf5558b74-8wxnf_1b8b459d-7a00-4e96-8916-4edd9fc87b99/prometheus-operator-admission-webhook/0.log" Jan 30 09:39:11 crc kubenswrapper[4870]: I0130 09:39:11.477654 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-hj2pn_614f63fc-ed66-41bb-b9fe-4229b3b67f50/prometheus-operator/0.log" Jan 30 09:39:11 crc kubenswrapper[4870]: I0130 09:39:11.684666 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lv4dk_0f7d84eb-b450-4168-b207-22520fed3fd3/operator/0.log" Jan 30 09:39:11 crc kubenswrapper[4870]: I0130 09:39:11.691621 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tmzq2_962cb597-f461-4983-b37a-a4c9e545f7d8/perses-operator/0.log" Jan 30 09:40:25 crc kubenswrapper[4870]: I0130 09:40:25.249474 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:40:25 crc kubenswrapper[4870]: I0130 09:40:25.250023 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:40:55 crc kubenswrapper[4870]: I0130 09:40:55.249273 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 30 09:40:55 crc kubenswrapper[4870]: I0130 09:40:55.251964 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:41:22 crc kubenswrapper[4870]: I0130 09:41:22.067803 4870 generic.go:334] "Generic (PLEG): container finished" podID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerID="6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879" exitCode=0 Jan 30 09:41:22 crc kubenswrapper[4870]: I0130 09:41:22.067902 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" event={"ID":"6769b74f-20a7-48a8-b39b-d812418dbab4","Type":"ContainerDied","Data":"6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879"} Jan 30 09:41:22 crc kubenswrapper[4870]: I0130 09:41:22.069133 4870 scope.go:117] "RemoveContainer" containerID="6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879" Jan 30 09:41:23 crc kubenswrapper[4870]: I0130 09:41:23.069714 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ngvkt_must-gather-jl6kn_6769b74f-20a7-48a8-b39b-d812418dbab4/gather/0.log" Jan 30 09:41:25 crc kubenswrapper[4870]: I0130 09:41:25.250395 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 09:41:25 crc kubenswrapper[4870]: I0130 09:41:25.250666 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 09:41:25 crc kubenswrapper[4870]: I0130 09:41:25.250706 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" Jan 30 09:41:25 crc kubenswrapper[4870]: I0130 09:41:25.251466 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47da0905bfb9191daf20cd74642a93fa3691ef9a29f1507f03d471be1698c277"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 09:41:25 crc kubenswrapper[4870]: I0130 09:41:25.251519 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://47da0905bfb9191daf20cd74642a93fa3691ef9a29f1507f03d471be1698c277" gracePeriod=600 Jan 30 09:41:26 crc kubenswrapper[4870]: I0130 09:41:26.107250 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="47da0905bfb9191daf20cd74642a93fa3691ef9a29f1507f03d471be1698c277" exitCode=0 Jan 30 09:41:26 crc kubenswrapper[4870]: I0130 09:41:26.107334 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"47da0905bfb9191daf20cd74642a93fa3691ef9a29f1507f03d471be1698c277"} Jan 30 09:41:26 crc kubenswrapper[4870]: I0130 09:41:26.108316 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" 
event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerStarted","Data":"6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"} Jan 30 09:41:26 crc kubenswrapper[4870]: I0130 09:41:26.108493 4870 scope.go:117] "RemoveContainer" containerID="7f9a0e2b7a855a7780193a1b5ddeb2a0d89688bd487b6861d08256fc31412f40" Jan 30 09:41:31 crc kubenswrapper[4870]: I0130 09:41:31.569268 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ngvkt/must-gather-jl6kn"] Jan 30 09:41:31 crc kubenswrapper[4870]: I0130 09:41:31.570195 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerName="copy" containerID="cri-o://94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868" gracePeriod=2 Jan 30 09:41:31 crc kubenswrapper[4870]: I0130 09:41:31.580425 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ngvkt/must-gather-jl6kn"] Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.067223 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ngvkt_must-gather-jl6kn_6769b74f-20a7-48a8-b39b-d812418dbab4/copy/0.log" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.067965 4870 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.181835 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltpzc\" (UniqueName: \"kubernetes.io/projected/6769b74f-20a7-48a8-b39b-d812418dbab4-kube-api-access-ltpzc\") pod \"6769b74f-20a7-48a8-b39b-d812418dbab4\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.181939 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6769b74f-20a7-48a8-b39b-d812418dbab4-must-gather-output\") pod \"6769b74f-20a7-48a8-b39b-d812418dbab4\" (UID: \"6769b74f-20a7-48a8-b39b-d812418dbab4\") " Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.203861 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6769b74f-20a7-48a8-b39b-d812418dbab4-kube-api-access-ltpzc" (OuterVolumeSpecName: "kube-api-access-ltpzc") pod "6769b74f-20a7-48a8-b39b-d812418dbab4" (UID: "6769b74f-20a7-48a8-b39b-d812418dbab4"). InnerVolumeSpecName "kube-api-access-ltpzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.209019 4870 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ngvkt_must-gather-jl6kn_6769b74f-20a7-48a8-b39b-d812418dbab4/copy/0.log" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.215533 4870 generic.go:334] "Generic (PLEG): container finished" podID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerID="94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868" exitCode=143 Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.215602 4870 scope.go:117] "RemoveContainer" containerID="94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.215611 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngvkt/must-gather-jl6kn" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.253646 4870 scope.go:117] "RemoveContainer" containerID="6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.289150 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltpzc\" (UniqueName: \"kubernetes.io/projected/6769b74f-20a7-48a8-b39b-d812418dbab4-kube-api-access-ltpzc\") on node \"crc\" DevicePath \"\"" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.343296 4870 scope.go:117] "RemoveContainer" containerID="94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868" Jan 30 09:41:32 crc kubenswrapper[4870]: E0130 09:41:32.343656 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868\": container with ID starting with 94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868 not found: ID does not exist" 
containerID="94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.343703 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868"} err="failed to get container status \"94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868\": rpc error: code = NotFound desc = could not find container \"94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868\": container with ID starting with 94b1b42bc8f7c336d8fcdcb61cfce9aebe53e0b19d58671be8fe96811a489868 not found: ID does not exist" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.343732 4870 scope.go:117] "RemoveContainer" containerID="6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879" Jan 30 09:41:32 crc kubenswrapper[4870]: E0130 09:41:32.344027 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879\": container with ID starting with 6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879 not found: ID does not exist" containerID="6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.344056 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879"} err="failed to get container status \"6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879\": rpc error: code = NotFound desc = could not find container \"6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879\": container with ID starting with 6fb7d80f8c08e636076085f11ddc05d718f9baf6df6e1017b15952720a016879 not found: ID does not exist" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.438971 4870 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6769b74f-20a7-48a8-b39b-d812418dbab4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6769b74f-20a7-48a8-b39b-d812418dbab4" (UID: "6769b74f-20a7-48a8-b39b-d812418dbab4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 09:41:32 crc kubenswrapper[4870]: I0130 09:41:32.493313 4870 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6769b74f-20a7-48a8-b39b-d812418dbab4-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 30 09:41:34 crc kubenswrapper[4870]: I0130 09:41:34.098539 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" path="/var/lib/kubelet/pods/6769b74f-20a7-48a8-b39b-d812418dbab4/volumes" Jan 30 09:41:39 crc kubenswrapper[4870]: I0130 09:41:39.660978 4870 scope.go:117] "RemoveContainer" containerID="c10769db3e913e00355f8966e729b5b6b9071d652adf30ce568e54dc81b0dfbf" Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.765416 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z2j9g"] Jan 30 09:43:15 crc kubenswrapper[4870]: E0130 09:43:15.767138 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerName="copy" Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767155 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerName="copy" Jan 30 09:43:15 crc kubenswrapper[4870]: E0130 09:43:15.767177 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerName="extract-content" Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767185 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" 
containerName="extract-content"
Jan 30 09:43:15 crc kubenswrapper[4870]: E0130 09:43:15.767204 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerName="extract-utilities"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767215 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerName="extract-utilities"
Jan 30 09:43:15 crc kubenswrapper[4870]: E0130 09:43:15.767226 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerName="registry-server"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767233 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerName="registry-server"
Jan 30 09:43:15 crc kubenswrapper[4870]: E0130 09:43:15.767253 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerName="gather"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767260 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerName="gather"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767569 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerName="gather"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767591 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe021a5-6534-4aad-aabc-da82e18587d6" containerName="registry-server"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.767614 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="6769b74f-20a7-48a8-b39b-d812418dbab4" containerName="copy"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.769512 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.778276 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2j9g"]
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.822186 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-utilities\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.822534 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xbj\" (UniqueName: \"kubernetes.io/projected/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-kube-api-access-88xbj\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.822667 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-catalog-content\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.924397 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-catalog-content\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.924613 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-utilities\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.924673 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xbj\" (UniqueName: \"kubernetes.io/projected/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-kube-api-access-88xbj\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.925257 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-utilities\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.925387 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-catalog-content\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:15 crc kubenswrapper[4870]: I0130 09:43:15.951198 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xbj\" (UniqueName: \"kubernetes.io/projected/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-kube-api-access-88xbj\") pod \"redhat-marketplace-z2j9g\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") " pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:16 crc kubenswrapper[4870]: I0130 09:43:16.099951 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:16 crc kubenswrapper[4870]: I0130 09:43:16.399097 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2j9g"]
Jan 30 09:43:17 crc kubenswrapper[4870]: I0130 09:43:17.278519 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerID="5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f" exitCode=0
Jan 30 09:43:17 crc kubenswrapper[4870]: I0130 09:43:17.278753 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2j9g" event={"ID":"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf","Type":"ContainerDied","Data":"5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f"}
Jan 30 09:43:17 crc kubenswrapper[4870]: I0130 09:43:17.278916 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2j9g" event={"ID":"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf","Type":"ContainerStarted","Data":"ce47c6d811f8804946b6cd93e6bc8bd97e403fdb31b9759bb3ce8bab5f26510f"}
Jan 30 09:43:18 crc kubenswrapper[4870]: I0130 09:43:18.292798 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2j9g" event={"ID":"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf","Type":"ContainerStarted","Data":"bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466"}
Jan 30 09:43:19 crc kubenswrapper[4870]: I0130 09:43:19.304416 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerID="bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466" exitCode=0
Jan 30 09:43:19 crc kubenswrapper[4870]: I0130 09:43:19.305806 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2j9g" event={"ID":"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf","Type":"ContainerDied","Data":"bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466"}
Jan 30 09:43:20 crc kubenswrapper[4870]: I0130 09:43:20.319332 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2j9g" event={"ID":"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf","Type":"ContainerStarted","Data":"ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5"}
Jan 30 09:43:20 crc kubenswrapper[4870]: I0130 09:43:20.352814 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z2j9g" podStartSLOduration=2.93440147 podStartE2EDuration="5.352791027s" podCreationTimestamp="2026-01-30 09:43:15 +0000 UTC" firstStartedPulling="2026-01-30 09:43:17.281697332 +0000 UTC m=+5635.977244441" lastFinishedPulling="2026-01-30 09:43:19.700086859 +0000 UTC m=+5638.395633998" observedRunningTime="2026-01-30 09:43:20.345265032 +0000 UTC m=+5639.040812141" watchObservedRunningTime="2026-01-30 09:43:20.352791027 +0000 UTC m=+5639.048338156"
Jan 30 09:43:25 crc kubenswrapper[4870]: I0130 09:43:25.250225 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:43:25 crc kubenswrapper[4870]: I0130 09:43:25.251157 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:43:26 crc kubenswrapper[4870]: I0130 09:43:26.100448 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:26 crc kubenswrapper[4870]: I0130 09:43:26.100808 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:26 crc kubenswrapper[4870]: I0130 09:43:26.145460 4870 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:26 crc kubenswrapper[4870]: I0130 09:43:26.474218 4870 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:26 crc kubenswrapper[4870]: I0130 09:43:26.525786 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2j9g"]
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.403563 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z2j9g" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="registry-server" containerID="cri-o://ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5" gracePeriod=2
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.872038 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.907570 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-utilities\") pod \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") "
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.907761 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-catalog-content\") pod \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") "
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.908082 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88xbj\" (UniqueName: \"kubernetes.io/projected/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-kube-api-access-88xbj\") pod \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\" (UID: \"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf\") "
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.909222 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-utilities" (OuterVolumeSpecName: "utilities") pod "2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" (UID: "2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.913638 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-kube-api-access-88xbj" (OuterVolumeSpecName: "kube-api-access-88xbj") pod "2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" (UID: "2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf"). InnerVolumeSpecName "kube-api-access-88xbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:43:28 crc kubenswrapper[4870]: I0130 09:43:28.969358 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" (UID: "2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.011077 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88xbj\" (UniqueName: \"kubernetes.io/projected/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-kube-api-access-88xbj\") on node \"crc\" DevicePath \"\""
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.011120 4870 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.011134 4870 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.417380 4870 generic.go:334] "Generic (PLEG): container finished" podID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerID="ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5" exitCode=0
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.417459 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2j9g" event={"ID":"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf","Type":"ContainerDied","Data":"ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5"}
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.417488 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2j9g"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.417684 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2j9g" event={"ID":"2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf","Type":"ContainerDied","Data":"ce47c6d811f8804946b6cd93e6bc8bd97e403fdb31b9759bb3ce8bab5f26510f"}
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.417701 4870 scope.go:117] "RemoveContainer" containerID="ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.437343 4870 scope.go:117] "RemoveContainer" containerID="bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.457749 4870 scope.go:117] "RemoveContainer" containerID="5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.538187 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2j9g"]
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.546975 4870 scope.go:117] "RemoveContainer" containerID="ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5"
Jan 30 09:43:29 crc kubenswrapper[4870]: E0130 09:43:29.547476 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5\": container with ID starting with ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5 not found: ID does not exist" containerID="ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.547527 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5"} err="failed to get container status \"ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5\": rpc error: code = NotFound desc = could not find container \"ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5\": container with ID starting with ca8c25f1254aa5ab54be39ce1a4ac849e05021e88a045e105c579b053b2866d5 not found: ID does not exist"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.547560 4870 scope.go:117] "RemoveContainer" containerID="bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466"
Jan 30 09:43:29 crc kubenswrapper[4870]: E0130 09:43:29.548112 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466\": container with ID starting with bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466 not found: ID does not exist" containerID="bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.548138 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466"} err="failed to get container status \"bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466\": rpc error: code = NotFound desc = could not find container \"bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466\": container with ID starting with bf7407dac474c4d4117f3382db3f03680b3b99f686b344d5f1e5b6698ae64466 not found: ID does not exist"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.548156 4870 scope.go:117] "RemoveContainer" containerID="5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f"
Jan 30 09:43:29 crc kubenswrapper[4870]: E0130 09:43:29.548642 4870 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f\": container with ID starting with 5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f not found: ID does not exist" containerID="5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.548677 4870 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f"} err="failed to get container status \"5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f\": rpc error: code = NotFound desc = could not find container \"5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f\": container with ID starting with 5cfc010005068388a1db08f2050d5e3b191c08b875c5d8cb2d42dbb809ed767f not found: ID does not exist"
Jan 30 09:43:29 crc kubenswrapper[4870]: I0130 09:43:29.550996 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2j9g"]
Jan 30 09:43:30 crc kubenswrapper[4870]: I0130 09:43:30.086283 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" path="/var/lib/kubelet/pods/2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf/volumes"
Jan 30 09:43:55 crc kubenswrapper[4870]: I0130 09:43:55.249304 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:43:55 crc kubenswrapper[4870]: I0130 09:43:55.250025 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.249683 4870 patch_prober.go:28] interesting pod/machine-config-daemon-j4sd8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.250402 4870 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.250456 4870 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8"
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.251120 4870 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"} pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.251175 4870 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerName="machine-config-daemon" containerID="cri-o://6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" gracePeriod=600
Jan 30 09:44:25 crc kubenswrapper[4870]: E0130 09:44:25.385979 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.998806 4870 generic.go:334] "Generic (PLEG): container finished" podID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" exitCode=0
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.998865 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" event={"ID":"5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d","Type":"ContainerDied","Data":"6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"}
Jan 30 09:44:25 crc kubenswrapper[4870]: I0130 09:44:25.998938 4870 scope.go:117] "RemoveContainer" containerID="47da0905bfb9191daf20cd74642a93fa3691ef9a29f1507f03d471be1698c277"
Jan 30 09:44:26 crc kubenswrapper[4870]: I0130 09:44:26.000404 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"
Jan 30 09:44:26 crc kubenswrapper[4870]: E0130 09:44:26.001421 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:44:37 crc kubenswrapper[4870]: I0130 09:44:37.074867 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"
Jan 30 09:44:37 crc kubenswrapper[4870]: E0130 09:44:37.076013 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:44:49 crc kubenswrapper[4870]: I0130 09:44:49.075404 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"
Jan 30 09:44:49 crc kubenswrapper[4870]: E0130 09:44:49.076759 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.075284 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"
Jan 30 09:45:00 crc kubenswrapper[4870]: E0130 09:45:00.077536 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.161966 4870 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"]
Jan 30 09:45:00 crc kubenswrapper[4870]: E0130 09:45:00.162513 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="registry-server"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.162539 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="registry-server"
Jan 30 09:45:00 crc kubenswrapper[4870]: E0130 09:45:00.162556 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="extract-content"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.162564 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="extract-content"
Jan 30 09:45:00 crc kubenswrapper[4870]: E0130 09:45:00.162585 4870 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="extract-utilities"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.162593 4870 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="extract-utilities"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.162847 4870 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed1bee2-d571-4dcc-8ba9-f4257bb5dacf" containerName="registry-server"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.163738 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.168634 4870 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.168642 4870 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.175416 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"]
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.211860 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54d93b9e-2b65-40bb-81f7-134f3ce2d101-secret-volume\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.212050 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54d93b9e-2b65-40bb-81f7-134f3ce2d101-config-volume\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.212338 4870 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwnwp\" (UniqueName: \"kubernetes.io/projected/54d93b9e-2b65-40bb-81f7-134f3ce2d101-kube-api-access-xwnwp\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.314304 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54d93b9e-2b65-40bb-81f7-134f3ce2d101-secret-volume\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.314356 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54d93b9e-2b65-40bb-81f7-134f3ce2d101-config-volume\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.314423 4870 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwnwp\" (UniqueName: \"kubernetes.io/projected/54d93b9e-2b65-40bb-81f7-134f3ce2d101-kube-api-access-xwnwp\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.315658 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54d93b9e-2b65-40bb-81f7-134f3ce2d101-config-volume\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.335981 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54d93b9e-2b65-40bb-81f7-134f3ce2d101-secret-volume\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.341099 4870 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwnwp\" (UniqueName: \"kubernetes.io/projected/54d93b9e-2b65-40bb-81f7-134f3ce2d101-kube-api-access-xwnwp\") pod \"collect-profiles-29496105-jk4bx\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.486778 4870 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:00 crc kubenswrapper[4870]: I0130 09:45:00.936888 4870 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"]
Jan 30 09:45:01 crc kubenswrapper[4870]: I0130 09:45:01.452749 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx" event={"ID":"54d93b9e-2b65-40bb-81f7-134f3ce2d101","Type":"ContainerStarted","Data":"202b2d053a72796ab3db27f1b87308f2dcc900d8af762a773e5d8fc1878c82e2"}
Jan 30 09:45:01 crc kubenswrapper[4870]: I0130 09:45:01.453395 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx" event={"ID":"54d93b9e-2b65-40bb-81f7-134f3ce2d101","Type":"ContainerStarted","Data":"50620433ec0edc768d17bbe6ad833e170a77e2cd498e1f06578e383e29a8cb5d"}
Jan 30 09:45:01 crc kubenswrapper[4870]: I0130 09:45:01.490867 4870 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx" podStartSLOduration=1.490848613 podStartE2EDuration="1.490848613s" podCreationTimestamp="2026-01-30 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 09:45:01.476771343 +0000 UTC m=+5740.172318462" watchObservedRunningTime="2026-01-30 09:45:01.490848613 +0000 UTC m=+5740.186395722"
Jan 30 09:45:02 crc kubenswrapper[4870]: I0130 09:45:02.462864 4870 generic.go:334] "Generic (PLEG): container finished" podID="54d93b9e-2b65-40bb-81f7-134f3ce2d101" containerID="202b2d053a72796ab3db27f1b87308f2dcc900d8af762a773e5d8fc1878c82e2" exitCode=0
Jan 30 09:45:02 crc kubenswrapper[4870]: I0130 09:45:02.462922 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx" event={"ID":"54d93b9e-2b65-40bb-81f7-134f3ce2d101","Type":"ContainerDied","Data":"202b2d053a72796ab3db27f1b87308f2dcc900d8af762a773e5d8fc1878c82e2"}
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.849940 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.893509 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwnwp\" (UniqueName: \"kubernetes.io/projected/54d93b9e-2b65-40bb-81f7-134f3ce2d101-kube-api-access-xwnwp\") pod \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") "
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.893659 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54d93b9e-2b65-40bb-81f7-134f3ce2d101-secret-volume\") pod \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") "
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.893714 4870 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54d93b9e-2b65-40bb-81f7-134f3ce2d101-config-volume\") pod \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\" (UID: \"54d93b9e-2b65-40bb-81f7-134f3ce2d101\") "
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.894603 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d93b9e-2b65-40bb-81f7-134f3ce2d101-config-volume" (OuterVolumeSpecName: "config-volume") pod "54d93b9e-2b65-40bb-81f7-134f3ce2d101" (UID: "54d93b9e-2b65-40bb-81f7-134f3ce2d101"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.899533 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d93b9e-2b65-40bb-81f7-134f3ce2d101-kube-api-access-xwnwp" (OuterVolumeSpecName: "kube-api-access-xwnwp") pod "54d93b9e-2b65-40bb-81f7-134f3ce2d101" (UID: "54d93b9e-2b65-40bb-81f7-134f3ce2d101"). InnerVolumeSpecName "kube-api-access-xwnwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.899764 4870 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54d93b9e-2b65-40bb-81f7-134f3ce2d101-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54d93b9e-2b65-40bb-81f7-134f3ce2d101" (UID: "54d93b9e-2b65-40bb-81f7-134f3ce2d101"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.996203 4870 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwnwp\" (UniqueName: \"kubernetes.io/projected/54d93b9e-2b65-40bb-81f7-134f3ce2d101-kube-api-access-xwnwp\") on node \"crc\" DevicePath \"\""
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.996238 4870 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54d93b9e-2b65-40bb-81f7-134f3ce2d101-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 09:45:03 crc kubenswrapper[4870]: I0130 09:45:03.996249 4870 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54d93b9e-2b65-40bb-81f7-134f3ce2d101-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 09:45:04 crc kubenswrapper[4870]: I0130 09:45:04.486400 4870 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx" event={"ID":"54d93b9e-2b65-40bb-81f7-134f3ce2d101","Type":"ContainerDied","Data":"50620433ec0edc768d17bbe6ad833e170a77e2cd498e1f06578e383e29a8cb5d"}
Jan 30 09:45:04 crc kubenswrapper[4870]: I0130 09:45:04.486447 4870 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50620433ec0edc768d17bbe6ad833e170a77e2cd498e1f06578e383e29a8cb5d"
Jan 30 09:45:04 crc kubenswrapper[4870]: I0130 09:45:04.486509 4870 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496105-jk4bx"
Jan 30 09:45:04 crc kubenswrapper[4870]: I0130 09:45:04.577011 4870 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"]
Jan 30 09:45:04 crc kubenswrapper[4870]: I0130 09:45:04.585036 4870 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496060-gq9mb"]
Jan 30 09:45:06 crc kubenswrapper[4870]: I0130 09:45:06.088510 4870 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f537a705-b98d-4cc1-8fba-f9fb4145fc33" path="/var/lib/kubelet/pods/f537a705-b98d-4cc1-8fba-f9fb4145fc33/volumes"
Jan 30 09:45:14 crc kubenswrapper[4870]: I0130 09:45:14.075400 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"
Jan 30 09:45:14 crc kubenswrapper[4870]: E0130 09:45:14.076188 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d"
Jan 30 09:45:25 crc kubenswrapper[4870]: I0130 09:45:25.074781 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f"
Jan 30 09:45:25 crc kubenswrapper[4870]: E0130 09:45:25.075738 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\""
pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:45:39 crc kubenswrapper[4870]: I0130 09:45:39.075012 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" Jan 30 09:45:39 crc kubenswrapper[4870]: E0130 09:45:39.075781 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:45:39 crc kubenswrapper[4870]: I0130 09:45:39.804835 4870 scope.go:117] "RemoveContainer" containerID="21570fbd391aa6805bfee83f36df9ca917daf03782908d47cd7dd4eedf90e176" Jan 30 09:45:52 crc kubenswrapper[4870]: I0130 09:45:52.088777 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" Jan 30 09:45:52 crc kubenswrapper[4870]: E0130 09:45:52.092277 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:46:06 crc kubenswrapper[4870]: I0130 09:46:06.075051 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" Jan 30 09:46:06 crc kubenswrapper[4870]: E0130 09:46:06.075849 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:46:19 crc kubenswrapper[4870]: I0130 09:46:19.075241 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" Jan 30 09:46:19 crc kubenswrapper[4870]: E0130 09:46:19.077653 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:46:34 crc kubenswrapper[4870]: I0130 09:46:34.075495 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" Jan 30 09:46:34 crc kubenswrapper[4870]: E0130 09:46:34.076295 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:46:46 crc kubenswrapper[4870]: I0130 09:46:46.074804 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" Jan 30 09:46:46 crc kubenswrapper[4870]: E0130 09:46:46.076860 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" Jan 30 09:47:01 crc kubenswrapper[4870]: I0130 09:47:01.074655 4870 scope.go:117] "RemoveContainer" containerID="6306a905a69b5913903a55220b56d897c0cf89cc4b73121b61a3a52cb36ff71f" Jan 30 09:47:01 crc kubenswrapper[4870]: E0130 09:47:01.076493 4870 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j4sd8_openshift-machine-config-operator(5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d)\"" pod="openshift-machine-config-operator/machine-config-daemon-j4sd8" podUID="5d3c8db6-cf22-4fb2-ae7c-a3d544473a6d" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515137077040024451 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015137077040017366 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015137063074016513 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015137063075015464 5ustar corecore